Pentaho Reports does not print Arabic characters when a report is run to PDF. I have made the following changes in the biserver-ce/tomcat/webapps/reports/WEB-INF/classes/classic-engine.properties file:
org.pentaho.reporting.engine.classic.core.modules.output.table.html.Encoding=UTF-8
org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.Encoding=IDENTITY-H
I am still facing the issue. Could anybody please help me? I am using Pentaho version 4.5.0.
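For anyone comparing settings, below is a sketch of the relevant classic-engine.properties lines with a hedged note on fonts: the Identity-H encoding only takes effect when the fonts assigned to the report's elements actually contain Arabic glyphs, so the properties alone are not always enough.
Code:
# HTML output encoding
org.pentaho.reporting.engine.classic.core.modules.output.table.html.Encoding=UTF-8
# PDF output: identity CID encoding, needed for non-Latin scripts
org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.Encoding=IDENTITY-H
# Note: these only help if the report elements use a font that covers
# Arabic (e.g. Arial or another Unicode font installed on the server);
# a font without the glyphs still prints blanks or boxes.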
↧
pdf view does not print arabic characters
↧
starting value of sequence
Hi, I created a sequence and I want it to start from a value that I select from a table. Can you please tell me how to do so?
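If this is a database sequence (rather than PDI's Add sequence step), the start value can be set from a query; a hedged sketch, assuming PostgreSQL and hypothetical names my_seq/my_table:
Code:
-- PostgreSQL: align the sequence with the current table contents
SELECT setval('my_seq', (SELECT COALESCE(MAX(id), 1) FROM my_table));
-- For other databases, fetch MAX(id) first and use the equivalent
-- ALTER SEQUENCE DDL with that value.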
↧
↧
Dynamic value for add sequence
Can we pass a dynamic value (the max value of another table's column) into "Start at Value" in the Add Sequence step?
Thank you.
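One pattern that should work, assuming your PDI version substitutes variables in that field (it is variable-enabled in recent releases): compute the max in a first transformation, publish it as a variable, and reference the variable in Add sequence. A sketch with hypothetical names:
Code:
-- Transformation A (Table input), run before Transformation B in the job:
SELECT COALESCE(MAX(some_col), 0) + 1 AS start_val FROM other_table
-- ...feed this single row into a "Set Variables" step defining START_VAL.
-- Transformation B: in the Add sequence step, set "Start at value" to ${START_VAL}.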
↧
Impossible to send any information with HTTP POST
Hello everybody :)
I would like to POST an XML file to a website (SSL). I followed this documentation:
http://wiki.pentaho.com/display/EAI/HTTP+Post
But it's impossible to send any information. The transformation executes without error, but no data is received by the server (checked with a sniffer).
I tested many combinations, but got no error (the documentation says it "will fail silently") and no packet was sent...
How can I solve my problem? :eek:
Technical information:
General Tab:
URL -> https//boutique.[hidden]/api/customers
Encoding -> UTF-8
Request entity field -> ${Internal.Job.Filename.Directory}\xml\addcustomer.xml
Post a file -> checked
Result fieldname -> http_result (this field comes back empty)
HTTP status code fieldname -> http_status (this field is empty as well)
HTTP login -> [hidden]
Fields tab: Body (Header) Parameters
Fields - Parameter - Header
Content-Type - text/xml - Y
Pentaho version : 4.2.1
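One detail worth ruling out first, assuming the URL above is copied verbatim from the step: the scheme is missing its colon (https// instead of https://), which could keep the request from ever leaving the machine without the step raising an error. The corrected value would look like:
Code:
https://boutique.[hidden]/api/customers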
Log file:
2015/03/31 12:13:29 - Pentaho Data Integration - PREVIEW!
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation was pre-loaded from the repository.
2015/03/31 12:13:29 - Send_to_PrestaShop - number of steps to run: {0}, number of hops: {1}
2015/03/31 12:13:29 - Send_to_PrestaShop - Dispatching started for transformation [Send_to_PrestaShop]
2015/03/31 12:13:29 - Send_to_PrestaShop - Number of arguments detected: 0
2015/03/31 12:13:29 - Send_to_PrestaShop - Enhanced control (safe) mode is available for this transformation.
2015/03/31 12:13:29 - Send_to_PrestaShop - This is not a replay of the transformation
2015/03/31 12:13:29 - Send_to_PrestaShop - 2 different steps will be loaded.
2015/03/31 12:13:29 - Send_to_PrestaShop - Allocating rowsets...
2015/03/31 12:13:29 - Send_to_PrestaShop - Allocating rowsets for step 0 --> send xml to PrestaShop
2015/03/31 12:13:29 - Send_to_PrestaShop - previous copies = 1, next copies = 1
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation allocated a new rowset [send xml to PrestaShop.0 - Alimentation fichier.0]
2015/03/31 12:13:29 - Send_to_PrestaShop - Allocated 1 rowset(s) for step 0 --> send xml to PrestaShop
2015/03/31 12:13:29 - Send_to_PrestaShop - Allocating rowsets for step 1 --> Alimentation fichier
2015/03/31 12:13:29 - Send_to_PrestaShop - Allocated 1 rowset(s) for step 1 --> Alimentation fichier
2015/03/31 12:13:29 - Send_to_PrestaShop - Allocating steps & step data...
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation is about to allocate step [send xml to PrestaShop] of type [HTTPPOST]
2015/03/31 12:13:29 - Send_to_PrestaShop - Nr of step copies = 1
2015/03/31 12:13:29 - send xml to PrestaShop.0 - distribution activated
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Starting allocation of buffers & new threads...
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Step info: nrinput=0 nroutput=1
2015/03/31 12:13:29 - send xml to PrestaShop.0 - The output relation is 1:1
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Found output rowset [send xml to PrestaShop.0 - Alimentation fichier.0]
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Finished dispatching
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation allocated a new step: [send xml to PrestaShop].0
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation is about to allocate step [Alimentation fichier] of type [TextFileOutput]
2015/03/31 12:13:29 - Send_to_PrestaShop - Nr of step copies = 1
2015/03/31 12:13:29 - Alimentation fichier.0 - distribution activated
2015/03/31 12:13:29 - Alimentation fichier.0 - Starting allocation of buffers & new threads...
2015/03/31 12:13:29 - Alimentation fichier.0 - Step info: nrinput=1 nroutput=0
2015/03/31 12:13:29 - Alimentation fichier.0 - Got previous step from [Alimentation fichier] #0 --> send xml to PrestaShop
2015/03/31 12:13:29 - Alimentation fichier.0 - The input relation is 1:1
2015/03/31 12:13:29 - Alimentation fichier.0 - Found input rowset [send xml to PrestaShop.0 - Alimentation fichier.0]
2015/03/31 12:13:29 - Alimentation fichier.0 - Finished dispatching
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation allocated a new step: [Alimentation fichier].0
2015/03/31 12:13:29 - Send_to_PrestaShop - This transformation cannot be re-executed with the replay date: 2015/03/31 12:13:29
2015/03/31 12:13:29 - Send_to_PrestaShop - Initialising 2 step(s)...
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Running on slave server #0/1.
2015/03/31 12:13:29 - Alimentation fichier.0 - Running on slave server #0/1.
2015/03/31 12:13:29 - Alimentation fichier.0 - The parent directory [file:///L:/Prestashop/Clients] exists.
2015/03/31 12:13:29 - Alimentation fichier.0 - Opening output stream in nocompress mode
2015/03/31 12:13:29 - Alimentation fichier.0 - Opening output stream in default encoding
2015/03/31 12:13:29 - Alimentation fichier.0 - Opened new file with name [L:\Prestashop\Clients\file.txt]
2015/03/31 12:13:29 - Send_to_PrestaShop - Step [send xml to PrestaShop.0] was initialised without error.
2015/03/31 12:13:29 - Send_to_PrestaShop - Step [Alimentation fichier.0] was initialised without error.
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Starting...
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Processing done for 1 row(s).
2015/03/31 12:13:29 - send xml to PrestaShop.0 - Step execution finished (Inputs=0, Outputs=0, Read=0, Written=0, Updated=0, Errors=0)
2015/03/31 12:13:29 - Send_to_PrestaShop - The transformation allocated 2 step threads and 1 rowset(s).
2015/03/31 12:13:29 - Alimentation fichier.0 - Starting...
2015/03/31 12:13:29 - Alimentation fichier.0 - Processing done for 0 row(s).
2015/03/31 12:13:29 - Alimentation fichier.0 - Closing output stream
2015/03/31 12:13:29 - Alimentation fichier.0 - Closed output stream
2015/03/31 12:13:29 - Alimentation fichier.0 - Closing normal file ...
2015/03/31 12:13:29 - Alimentation fichier.0 - Step execution finished (Inputs=0, Outputs=0, Read=0, Written=0, Updated=0, Errors=0)
2015/03/31 12:13:29 - Pentaho Data Integration - The transformation run has completed!
↧
Unable to launch Spoon
Hi,
I downloaded pdi-ce-5.3.0.0-213.zip and set the PENTAHO_JAVA_HOME and PENTAHO_JAVA environment variables on Windows 7 64-bit. When I try to launch, I get the following error:
---------------------------
Java Virtual Machine Launcher
---------------------------
Error: Unable to access jarfile launcher\pentaho-application-launcher-5.3.0.0-213.jar
---------------------------
OK
---------------------------
G:\PDI>start "Spoon" "C:\Program Files\Java\jre1.8.0_31\bin\javaw.exe" "-Xmx512
m" "-XX:MaxPermSize=256m" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-
DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACK
AGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-ap
plication-launcher-5.3.0.0-213.jar -lib ..\libswt\win64
Where am I going wrong?
Regards,
Subramanian S.
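The message points at the working directory rather than at Java: the launcher jar is resolved relative to wherever the start command runs from, and the prompt above shows G:\PDI while the jar normally lives inside the extracted data-integration folder. A sketch of the fix, assuming the zip was unpacked to G:\PDI (so the scripts sit in G:\PDI\data-integration); if the jar really is at G:\PDI\launcher\..., the zip may not have extracted completely:
Code:
G:\>cd /d G:\PDI\data-integration
G:\PDI\data-integration>Spoon.bat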
↧
↧
Kettle Error Message
Hi All.
I'm getting a parsing error from Kettle concerning my JSON input file.
I know it sounds weird, but Kettle gives you the position of the character it doesn't like in the file. The first error message I got was a complaint about character 1798! I mean, come on guys, I don't have a tool that finds that for me. Now I've got another parsing error referring again to a character's position in the file. My file is thousands and thousands of characters long, and it's a short one compared to more of our input.
Does Kettle have a tool that will "find the nth position" in a file? Because this is really, really... well... you know.
Thanks in advance.
T.
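There's nothing built into Kettle for this that I know of, but jumping to a character offset is a one-liner outside it; a sketch with a hypothetical file name (most GUI editors have an equivalent, e.g. Notepad++'s Ctrl+G in offset mode):
Code:
# show the 40 characters surrounding offset 1798 (Linux / Cygwin / Git Bash)
head -c 1818 input.json | tail -c 40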
↧
Solution consultant certification
Hi guys,
I'm going to take the Pentaho Solution Consultant certification, and I'd like to ask for some information regarding it.
What kind of questions are in the examination?
Are Xactions part of the program?
Any useful information is really appreciated, like study guides, materials, and any tips!
Regards
Nico
↧
Resulting value is not a tablemodel
I'm running into an error when previewing results for my scriptable result set. Apart from modifying the connection method, the code compiled (and ran) successfully in NetBeans. I've been operating under the assumption that plain old Java runs unmodified in BeanShell. Any tips?
Code:
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.Statement;
import java.sql.ResultSet;
import java.math.BigDecimal;
import java.sql.DriverManager;
import java.sql.SQLException;
import javax.sql.RowSet;
import java.sql.PreparedStatement;
/* PRD Only */
import org.pentaho.reporting.engine.classic.core.util.TypedTableModel;
import javax.naming.InitialContext;
public class JDBCReport {
public static void main(String[] args) throws SQLException, ClassNotFoundException, Exception {
InitialContext cxt = new InitialContext();
//DataSource ds = (DataSource) cxt.lookup("java:/comp/env/jdbc/aim");
DataSource ds = (DataSource) cxt.lookup("aim");
Connection c = ds.getConnection();
Statement stmt = c.createStatement();
String query = ""; //Snipped
ResultSet rs = stmt.executeQuery(query);
/* Pentaho Centric. */
String [] columnNames = new String [] {"Col1", "Col2", "Col3", "Col4", "Col5", "Col6", "Col7", "Col8", "Col9", "Col10", "Col11", "Col12"};
Class [] columnTypes = new Class [] {String.class, String.class, String.class, String.class, String.class, String.class, String.class, String.class, String.class, String.class, String.class, String.class};
TypedTableModel model = new TypedTableModel(columnNames, columnTypes);
while(rs.next()){
if ( getCnt(rs.getString(13)) ) {
model.addRow(new Object [] {rs.getString(1),rs.getString(2),rs.getString(3),rs.getString(4),rs.getString(5),rs.getString(6),rs.getString(7),rs.getString(8),rs.getString(9),rs.getString(10),rs.getString(11),rs.getString(12) });
} else {
// Go to the next record.
continue;
}
}
c.close();
return model;
} // End main
static boolean getCnt(String value) throws SQLException, Exception {
int rowCnt =-1;
String sql = ""; //Snipped
Connection c = ds.getConnection();
PreparedStatement preparedStatement = c.prepareStatement(sql);
preparedStatement.setString(1,value);
ResultSet rs = preparedStatement.executeQuery();
if (rs.next()) { rowCnt = rs.getInt(1); } // Store the value in the rowCnt variable.
if ( rowCnt >1 ) {
return true;
} else {
return false;
}
c.close();
} // End getCnt method
} // End JDBCReport class
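A likely cause, for what it's worth: PRD's scriptable data source evaluates BeanShell as a plain script and uses the script's own return value, so code wrapped in a class with a main() is parsed but never executed (and main() is declared void, so it couldn't return the model anyway), which matches the "Resulting value is not a tablemodel" error. Note also that getCnt() references ds, which is local to main(), so it wouldn't compile as plain Java either. A minimal sketch of the same logic in script form, keeping the poster's JNDI name aim and leaving the snipped query as a placeholder:
Code:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.naming.InitialContext;
import javax.sql.DataSource;
import org.pentaho.reporting.engine.classic.core.util.TypedTableModel;

// Build the model the report will consume.
String[] columnNames = new String[] { "Col1", "Col2", "Col3" };
Class[] columnTypes = new Class[] { String.class, String.class, String.class };
TypedTableModel model = new TypedTableModel(columnNames, columnTypes);

// Look up the connection and run the (snipped) query.
DataSource ds = (DataSource) new InitialContext().lookup("aim");
Connection c = ds.getConnection();
Statement stmt = c.createStatement();
ResultSet rs = stmt.executeQuery("..."); // snipped query goes here
while (rs.next()) {
    model.addRow(new Object[] { rs.getString(1), rs.getString(2), rs.getString(3) });
}
rs.close();
stmt.close();
c.close();

// The script body's return value is what PRD receives as the data set.
return model;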
↧
Bar chart All in one - Bar & line with same scale
Hi everyone! I'm building a bar chart with lines, but I need the bars and the line to use the same scale.
I'm working with the latest version of CDE.
Thanks and regards
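If the line is rendered through CCC's second plot (the usual CDE setup for bars plus lines), I believe it gets its own right-hand axis by default; a hedged sketch of the CCC v2 options that put both plots on one scale (set in the chart component's advanced properties):
Code:
// make the second (line) plot use the same ortho axis as the bars
plot2: true,
plot2OrthoAxis: 1  // 1 = shared axis; 2 = separate secondary axis (default)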
↧
↧
Degenerate dimensions in Mondrian 4
Hi everyone.
I'm trying to implement a couple of degenerate dimensions on Mondrian 4 but haven't been able to do it. The docs still have the "old" way.
This is my Mondrian 3 schema that works fine:
Code:
<Schema name="cvdelop">
<Cube name="Sales" visible="true" cache="true" enabled="true">
<Table name="sales" schema="public" alias="">
</Table>
<Dimension type="StandardDimension" visible="true" foreignKey="store" name="Store">
<Hierarchy name="Store" visible="true" hasAll="true">
<Level name="Store" visible="true" column="store" uniqueMembers="false">
</Level>
</Hierarchy>
</Dimension>
<Dimension type="StandardDimension" visible="true" foreignKey="city" name="City">
<Hierarchy name="City" visible="true" hasAll="true">
<Level name="City" visible="true" column="city" uniqueMembers="false">
</Level>
</Hierarchy>
</Dimension>
<Measure name="Units Sold" column="unitssold" aggregator="sum" visible="true">
</Measure>
</Cube>
</Schema>
But when I try to do something similar on 4, it gives me this error:
mondrian.rolap.RolapSchema$MondrianSchemaException: table must be specified (in Attribute 'City') (at line 0, column 215)
at mondrian.rolap.RolapSchemaLoaderHandlerImpl.error(RolapSchemaLoaderHandlerImpl.java:117)
at mondrian.rolap.RolapSchemaLoader.createColumnList(RolapSchemaLoader.java:3999)
at mondrian.rolap.RolapSchemaLoader.createAttribute(RolapSchemaLoader.java:3397)
at mondrian.rolap.RolapSchemaLoader.getOrCreateDimension(RolapSchemaLoader.java:2724)
at mondrian.rolap.RolapSchemaLoader.createCube(RolapSchemaLoader.java:1563)
at mondrian.rolap.RolapSchemaLoader.loadStage2(RolapSchemaLoader.java:420)
at mondrian.rolap.RolapSchemaLoader.loadStage1(RolapSchemaLoader.java:336)
The Mondrian 4 schema is this one:
Code:
<?xml version='1.0'?>
<Schema name='cdvelop' metamodelVersion='4.0'>
<PhysicalSchema>
<Table name='sales' />
</PhysicalSchema>
<Cube name='Sales'>
<Dimensions>
<Dimension name='City' key='City'>
<Attributes>
<Attribute name='City' keyColumn='city' hasHierarchy='false'/>
</Attributes>
<Hierarchies>
<Hierarchy name="City" hasAll="true">
<Level attribute="City"/>
</Hierarchy>
</Hierarchies>
</Dimension>
<Dimension name='Store' key='Store'>
<Attributes>
<Attribute name='Store' keyColumn='store' hasHierarchy='false'/>
</Attributes>
<Hierarchies>
<Hierarchy name="Store" hasAll="true">
<Level attribute="Store"/>
</Hierarchy>
</Hierarchies>
</Dimension>
</Dimensions>
<MeasureGroups>
<MeasureGroup name='Sales' table='sales'>
<Measures>
<Measure name='Units sold' column='unitssold' aggregator='sum' formatString='#,###'/>
</Measures>
<DimensionLinks>
<ForeignKeyLink dimension='City' foreignKeyColumn='city'/>
<ForeignKeyLink dimension='Store' foreignKeyColumn='store'/>
</DimensionLinks>
</MeasureGroup>
</MeasureGroups>
</Cube>
</Schema>
Any idea?
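In case it helps: the error complains that the City attribute has no table to resolve keyColumn='city' against. In Mondrian 4 the dimension (or each attribute) names its table explicitly, and a degenerate dimension points at the fact table itself and is linked through FactLink rather than ForeignKeyLink. A hedged sketch along those lines, based on the Mondrian 4 schema documentation and untested against this cube:
Code:
<Dimension name='City' table='sales' key='City'>
  <Attributes>
    <Attribute name='City' keyColumn='city' hasHierarchy='false'/>
  </Attributes>
  <!-- hierarchies as before -->
</Dimension>
<!-- ...and in the measure group: -->
<DimensionLinks>
  <FactLink dimension='City'/>
  <FactLink dimension='Store'/>
</DimensionLinks>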
↧
Jobstep "Get a file with SFTP" fails
if Target Directory contains variables like ${Internal.Job.Filename.Directory}/subdir/
2015/04/01 12:47:38 - Get a file with SFTP - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Error getting files from SFTP :
2015/04/01 12:47:38 - Get a file with SFTP - 4:
2015/04/01 12:47:38 - Get a file with SFTP - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
2015/04/01 12:47:38 - Get a file with SFTP - 4:
2015/04/01 12:47:38 - Get a file with SFTP -
2015/04/01 12:47:38 - Get a file with SFTP -
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.entries.sftp.SFTPClient.get(SFTPClient.java:220)
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.entries.sftp.JobEntrySFTP.execute(JobEntrySFTP.java:684)
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.Job.execute(Job.java:716)
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.Job.execute(Job.java:859)
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.Job.execute(Job.java:532)
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.Job.run(Job.java:424)
2015/04/01 12:47:38 - Get a file with SFTP - Caused by: 4:
2015/04/01 12:47:38 - Get a file with SFTP - at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:902)
2015/04/01 12:47:38 - Get a file with SFTP - at org.pentaho.di.job.entries.sftp.SFTPClient.get(SFTPClient.java:218)
2015/04/01 12:47:38 - Get a file with SFTP - ... 5 more
2015/04/01 12:47:38 - Get a file with SFTP - Caused by: java.io.FileNotFoundException: E:\pdi-ce-5.3.0.0-213\data-integration\file:\C:\Users\...\...\..\..\item-20120806190027-261853.sku (The filename, directory name or volume label syntax is incorrect)
2015/04/01 12:47:38 - Get a file with SFTP - at java.io.FileOutputStream.open(Native Method)
2015/04/01 12:47:38 - Get a file with SFTP - at java.io.FileOutputStream.<init>(FileOutputStream.java:206)
2015/04/01 12:47:38 - Get a file with SFTP - at java.io.FileOutputStream.<init>(FileOutputStream.java:95)
2015/04/01 12:47:38 - Get a file with SFTP - at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:878)
2015/04/01 12:47:38 - Get a file with SFTP - ... 6 more
Is this a bug?
Please help.
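What the stack trace shows, for what it's worth: ${Internal.Job.Filename.Directory} expands to a VFS URL (file:///C:/Users/...), and the SFTP entry treats the target directory as a plain local path, so the URL gets glued onto PDI's working directory, producing the impossible E:\pdi-ce-...\data-integration\file:\C:\... path in the FileNotFoundException. A hedged workaround is to hand the entry a plain OS path via your own variable or parameter instead of the internal one:
Code:
# hypothetical job parameter holding a plain path (no file:/// prefix)
TARGET_DIR = C:\Users\me\jobs\subdir
# Target directory in "Get a file with SFTP": ${TARGET_DIR}/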
↧
Install BI Server 5.3 (Windows 7) Java Path Issue
I am brand new to Pentaho and am attempting to install Pentaho BI Server 5.3.0.0-213 on a Windows 7 64-bit computer.
As instructed in several guides posted online, I set the environment variable for my Java installation. This is what happens when I try to start the BI Server application:
==================================
c:\Program Files\Pentaho\biserver-ce>start-pentaho.bat
DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.7.0_21
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jdk1.7.0_21\bin\java.exe
The system cannot find the path specified.
'startup' is not recognized as an internal or external command,
operable program or batch file.
c:\Program Files\Pentaho\biserver-ce>
==================================
I have several installations or at least Java directories:
C:\Program Files\Java\jdk1.7.0_21
C:\Program Files\Java\jre7
C:\Program Files (x86)\Java\jre1.6.0_22
C:\Program Files (x86)\Java\jre1.7.0_11
C:\Program Files (x86)\Java\jre6
C:\Program Files (x86)\Java\jre7
I've tried setting the environment variable to each of these paths; each time the DEBUG lines correctly reference the defined path, but I get the same message that the path is not found.
Does anyone have any ideas or advice?
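Note that the DEBUG lines show Java resolving fine; the failure comes one step later, when start-pentaho.bat hands off to Tomcat's startup.bat. So the path Windows cannot find is most likely the Tomcat one, not Java. A quick hedged check, using the install path from the prompt above:
Code:
dir "c:\Program Files\Pentaho\biserver-ce\tomcat\bin\startup.bat"
If that file is missing, the download may not have extracted completely.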
↧
Sequence start value not 1
Hi, I've created a sequence and I want it to start from a value that I select from a table.
Can anyone tell me how, please?
↧
↧
Compare tables
Hello everyone, I need your help.
I want to compare 2 tables:
old_table(id_old, name_old)
test_table(id_test, name_test)
I want to compare all the rows from old_table with the rows in test_table and then insert the difference (rows that are in old_table but not in test_table) into test_table.
I think "Compare table" does this kind of work, but I don't know how to use it. Can you give me some help please? :o:o
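The PDI-native route is two sorted Table inputs feeding a "Merge rows (diff)" step, then a "Filter rows" on the flag field = 'new', then a Table output into test_table. If plain SQL is acceptable instead, the whole thing is one anti-join insert; a sketch assuming id_old pairs with id_test and name_old with name_test:
Code:
INSERT INTO test_table (id_test, name_test)
SELECT o.id_old, o.name_old
FROM old_table o
LEFT JOIN test_table t ON t.id_test = o.id_old
WHERE t.id_test IS NULL;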
↧
want to remove negative part of scale of waterfall chart
Hi all,
I have a waterfall chart in my dashboard. It shows the negative part of the scale for some values, and for others it doesn't. I want to remove this negative scale part for all values. Please help.
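If this is a CCC waterfall in CDE, I believe pinning the orthogonal axis minimum at zero removes the negative region; a hedged sketch of the CCC v2 option (set via the component's advanced properties or preExecution):
Code:
orthoAxisFixedMin: 0  // don't extend the value scale below zero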
↧
Calculating closing ratio across multiple time frames
I need to calculate a closing ratio where sales leads are counted in the current month, but sales can be counted across multiple months. I am then filtering the data based on the sales leads; however, when doing so, I am only counting the current month's leads (which I want), but I may lose some sales, since they can come from leads in any previous month.
Example:
Lead1 = Sale1, L2 = S2, etc...
Jan = L1, L2, L3, S1 = Closing ratio of 33%
Feb = L4, L5, L6, S2 = Closing ratio of 33%
Mar = L7 L8 L9, S3, S4 = Closing ratio of 66%
Even though the lead(L2) for S2 happened in Jan, the sale was counted in Feb. Same thing for S3 and S4, the leads came in previous months but the sales need to be counted in the month they are sold (it is possible to therefore have a closing ratio more than 100%).
We have one fact table with granularity of one row per lead. Two date dimensions of lead date and sold date. Flag that indicates if the lead was sold or not.
Does anyone have any solutions so that I can create a single table in analyzer that has the correct lead and sale count when filter by lead month?
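One modeling pattern that fits what's described, sketched with hypothetical names: expose sales as their own fact (a view over the lead fact, dated by sold date), so a lead-month filter constrains only the lead count while the sales count is sliced through its own date role; a calculated [Sales Count] / [Lead Count] measure then yields the ratio, which can legitimately exceed 100% as in the March example.
Code:
-- hypothetical view: one row per sale, keyed by the month it closed
CREATE VIEW fact_sales AS
SELECT sold_date_id, 1 AS sales_cnt
FROM fact_leads
WHERE sold_flag = 1;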
↧
DB2 Connectivity issue in Spoon 5.3
While attempting to connect to an IBM DB2 database, Spoon always errors out with the following message. Please provide any pointers or a solution to the problem.
Appreciate that and Thanks in Advance.
Message from connectivity test:
Error connecting to database [test] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Driver class 'com.ibm.db2.jcc.DB2Driver' could not be found, make sure the 'IBM DB2' driver (jar file) is installed.
com.ibm.db2.jcc.DB2Driver
org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Driver class 'com.ibm.db2.jcc.DB2Driver' could not be found, make sure the 'IBM DB2' driver (jar file) is installed.
com.ibm.db2.jcc.DB2Driver
at org.pentaho.di.core.database.Database.normalConnect(Database.java:417)
at org.pentaho.di.core.database.Database.connect(Database.java:357)
at org.pentaho.di.core.database.Database.connect(Database.java:310)
at org.pentaho.di.core.database.Database.connect(Database.java:300)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2685)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.test(DatabaseDialog.java:109)
at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizardPage2.test(CreateDatabaseWizardPage2.java:157)
at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizardPage2$3.widgetSelected(CreateDatabaseWizardPage2.java:147)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizard.createAndRunDatabaseWizard(CreateDatabaseWizard.java:111)
at org.pentaho.di.ui.spoon.Spoon.createDatabaseWizard(Spoon.java:7635)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1316)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7979)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9310)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:654)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Driver class 'com.ibm.db2.jcc.DB2Driver' could not be found, make sure the 'IBM DB2' driver (jar file) is installed.
com.ibm.db2.jcc.DB2Driver
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:491)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:400)
... 43 more
Caused by: java.lang.ClassNotFoundException: com.ibm.db2.jcc.DB2Driver
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:466)
... 44 more
Caused by: java.util.zip.ZipException: invalid LOC header (bad signature)
at java.util.zip.ZipFile.read(Native Method)
at java.util.zip.ZipFile.access$1400(Unknown Source)
at java.util.zip.ZipFile$ZipFileInputStream.read(Unknown Source)
at java.util.zip.ZipFile$ZipFileInflaterInputStream.fill(Unknown Source)
at java.util.zip.InflaterInputStream.read(Unknown Source)
at sun.misc.Resource.getBytes(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
... 51 more
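The deepest cause in the stack is java.util.zip.ZipException: invalid LOC header (bad signature), which means the DB2 driver jar on the classpath is corrupt (typically a truncated download or copy), not merely absent. Re-copying the IBM jars usually clears it; a sketch assuming the standard driver file names:
Code:
# replace the corrupt jar with a fresh copy of the IBM JDBC driver + license jar
cp db2jcc4.jar db2jcc_license_cu.jar /path/to/data-integration/lib/
# then restart Spoon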
↧
↧
Update step into Postgres Partition
We have an insert-biased transformation that attempts to insert rows before attempting to update them. We do this because the bulk of the rows we're processing are inserts, and Postgres doesn't have built-in upsert functionality. We'd like to keep the transformation insert-biased, so that we don't incur the cost of attempting an update on each row. Instead, we attempt an insert and, if it fails due to a unique constraint violation, we send the row down an alternative path for updates.
The update path works well on a monolithic table, but using a partitioned Postgres table creates additional problems. Kettle's 'Table Output' step gives the option of specifying which table should be written to on a row-by-row basis, and we use this when we insert to specify which partition table the row belongs to. Kettle's 'Update' step doesn't give the same option, which results in our not being able to utilize Postgres partition pruning. As a result, Updates become very slow.
We've tried:
1. Using a Java User Class to open a connection, execute our query against the partition table, and close the connection.
2. Writing a SQL update in the SQL Execution step.
Each of these solutions ends up committing one row at a time (rather than hundreds). This is slow.
We believe that a Java User Class might still be the answer, if we could open our connection outside of the processRow() method and set it to commit after every 100 (n) rows. The problems we see here are getting access to a more global Database object, committing whatever statements are left uncommitted, and closing the connection when finished.
Java User Class:
Code:
// before
Database db = new Database(dbMeta);
db.setCommit(100);
db.connect();
// after
db.disconnect();
public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException {
String sql = ...;
PreparedStatement ps = db.prepareSQL(sql);
ps.execute();
}
Ideally, the Update step would include the same option as the Table Output step, "Is the name of the table defined in a field? / Field that contains name of table:", so that we could both specify a partition table to update and benefit from having larger commit sizes. In lieu of this, what solutions are available to us?
Thanks,
Will
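For the buffered-commit idea above, here is a hedged User Defined Java Class sketch: the connection is opened once in init(), updates run per row in processRow(), and commits happen every 100 rows plus once more in dispose(). The connection name my_pg and the fields partition_table/id are made up, and the Database constructor signature varies across Kettle versions (4.x takes only the DatabaseMeta), so treat this as a shape rather than drop-in code:
Code:
import java.sql.PreparedStatement;
import org.pentaho.di.core.database.Database;
import org.pentaho.di.core.database.DatabaseMeta;
import org.pentaho.di.core.exception.KettleException;

Database db;
int pending = 0;

public boolean init(StepMetaInterface stepMetaInterface, StepDataInterface stepDataInterface) {
  if (!parent.initImpl(stepMetaInterface, stepDataInterface)) return false;
  try {
    DatabaseMeta dbMeta = getTransMeta().findDatabase("my_pg"); // hypothetical connection name
    db = new Database(dbMeta); // 5.x: new Database(parent, dbMeta)
    db.connect();
    db.getConnection().setAutoCommit(false); // we commit in batches ourselves
  } catch (Exception e) {
    logError("Could not open connection", e);
    return false;
  }
  return true;
}

public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException {
  Object[] r = getRow();
  if (r == null) { setOutputDone(); return false; }
  try {
    // Route the update to the row's own partition so Postgres can prune.
    String table = get(Fields.In, "partition_table").getString(r); // hypothetical field
    PreparedStatement ps = db.prepareSQL("UPDATE " + table + " SET ... WHERE id = ?"); // snipped
    ps.setLong(1, get(Fields.In, "id").getInteger(r));
    ps.executeUpdate();
    ps.close();
    if (++pending % 100 == 0) db.commit(); // batch the commits
  } catch (Exception e) {
    throw new KettleException(e);
  }
  putRow(getInputRowMeta(), r);
  return true;
}

public void dispose(StepMetaInterface smi, StepDataInterface sdi) {
  try { db.commit(); } catch (Exception e) { logError("final commit failed", e); }
  db.disconnect();
  parent.disposeImpl(smi, sdi);
}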
↧
MongoDB Input by ObjectID range
Dear ALL,
I need advice on how to write a proper query to select by a date range (using separately generated ObjectID string) from a MongoDB collection. In a nutshell I want to pull mongo documents for 1 day as defined by the ObjectID. Something like:
'where ObjectID between "5510a9000000000000000000" and "55110b700000000000000000"' or
'where ObjectID >= "5510a9000000000000000000" and ObjectID < "55110b700000000000000000"'
The following query works fine to retrieve rows (documents) with ObjectID>="5510a9000000000000000000":
{ "$query": {_id: {$gte: { $oid: "5510a9000000000000000000"}}}}
However, when I want to add an AND condition, the query does not seem to use the upper limit like ObjectID<"55110b700000000000000000". Possibly because I am adding the AND conditions incorrectly...
{ "$query": {_id: {$gte: { $oid: "5510a9000000000000000000"}}} , {_id: {$lt: {$oid: "55110b700000000000000000"}}} }
Any advice is appreciated!
-Art
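Both bounds on the same field belong inside a single _id document; the last query above puts them in two separate documents, which isn't well-formed JSON, and that would explain the upper limit being ignored. A sketch in the same extended-JSON style as the working query:
Code:
{ "$query" : { "_id" : { "$gte" : { "$oid" : "5510a9000000000000000000" },
                         "$lt"  : { "$oid" : "55110b700000000000000000" } } } }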
↧
Replacing fields under certain conditions
Hi all,
I'm working with a table from a SQL database and I would like to apply certain conditions to certain fields (columns) in this table. For example, with two columns (a string and an integer), I would like to check whether a row's my_operation column contains the string "duplicate", in which case I would multiply the second column (my_result) by two. In all other cases I wouldn't perform any operation.
Can I do this without using any SQL/JavaScript code? If not, what is the best alternative? In a first step, I'm applying switch/case but when trying to replace the fields I'm lost.
I'm a bit of a newbie with PDI and I would really appreciate some help.
Thank you! :)
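A script-free layout that matches this, using standard steps (field names from the post; the trailing merge is only needed if both paths must come back together in order), sketched below:
Code:
[Input]
  -> [Filter rows]     condition: my_operation CONTAINS "duplicate"
       true  -> [Add constants]   factor = 2 (Integer)
             -> [Calculator]      my_result_x2 = my_result * factor
             -> [Select values]   drop factor, rename my_result_x2 -> my_result
       false -> (rows pass through unchanged)
  -> [Sorted merge] (or a Dummy step) -> downstream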
↧