Channel: Pentaho Community Forums
Viewing all 16689 articles

Merge rows (diff) weird error....

Hello.

I'm trying to use the Merge rows diff in combination with the Synchronize after merge.

The first time I run it, when there's no data in my target table, it works fine. If I immediately rerun the transformation a second time without changing any data, it fails with the following error:

"OrderDate Timestamp : Comparing values can not be done with data type : 9"

The funny thing is that all my "OrderDate" values are valid dates... The column is a datetime (SQL Server) and I'm comparing it with the same datatype in the target table.

I'm getting a bit frustrated because it looks like a bug (I also don't understand why my date fields are seen as String by the Table Input, considering they are datetime...), and if it is a bug, it means I can't use Merge/Synchronize, so I'm a bit stuck :(
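If the cause is that Merge rows (diff) cannot compare PDI's Timestamp type (internal type id 9), a commonly suggested workaround is to convert the field to a plain Date before the merge, for example with a Select Values step (Meta-data tab). The conversion amounts to the following minimal standalone Java sketch (not Kettle code; the class and method names here are illustrative):

```java
import java.sql.Timestamp;
import java.util.Date;

public class TimestampWorkaround {
    // Converting java.sql.Timestamp to java.util.Date keeps millisecond
    // precision but drops the Timestamp subtype, so both streams present
    // the field as a plain, comparable Date value.
    static Date toDate(Timestamp ts) {
        return new Date(ts.getTime());
    }

    public static void main(String[] args) {
        Timestamp ts = Timestamp.valueOf("2014-05-02 11:34:34.123");
        System.out.println(toDate(ts));
    }
}
```

If both the source and target streams pass through the same conversion, Merge rows (diff) compares two Date values instead of a Timestamp and a Date.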

Any help would be appreciated.

Job executing even though it is not in the GUI

Hello,

I am using Spoon 4.2.1 and have a job scheduled to run once a month, which it does. The strange thing is that it executes twice: once when it is supposed to, and once at the time it was set to a long time ago.

I understood that a job can only execute if it is present in the GUI; am I wrong? If it is not shown in the GUI, it should not execute...

Any idea why this can happen? The server has already been restarted.

Regards

Are You Available : ETL Pentaho Developer, United States - Onshore

Hi,

This is Rahul; I am a recruiter based in San Francisco. We are looking for competent candidates with Pentaho experience to join one of our clients in California, United States, as an ETL Pentaho Developer / Business Intelligence consultant. If you are relocating within the States, we can discuss relocation assistance.

Please let me know your availability.

You may touch base with me at 209-757-4232, or email me at rahul@nextgentechinc.com.

Regards,

ERROR: No repository provided, can't load transformation.

Hello,
I have a repository and want to run a transformation from it at the MS-DOS command line.
I am running the following commands from cmd:




INFO 02-05 11:34:34,114 - Pan - Logging is at level : Detailed logging
INFO 02-05 11:34:34,115 - Pan - Start of run.
INFO 02-05 11:34:34,194 - org.pentaho.di.core.util.ResolverUtil@60d60cd6 - Scanning for classes in [file:/C:/kettle/data-integration/lib/kettle-engine.jar] matching criteria: [Lorg.pentaho.di.core.util.ResolverUtil$Test;@4e4e745
INFO 02-05 11:34:34,773 - Cargador de Paso - Buscando extensiones en directorio: plugins\steps
INFO 02-05 11:34:34,781 - org.pentaho.di.core.util.ResolverUtil@54891f43 - Scanning for classes in [file:/C:/kettle/data-integration/lib/kettle-engine.jar] matching criteria: [Lorg.pentaho.di.core.util.ResolverUtil$Test;@76639310
INFO 02-05 11:34:35,087 - Using "C:\Users\lpinzonm\AppData\Local\Temp\vfs_cache" as temporary files store.
INFO 02-05 11:34:35,202 - DBCache - Loading database cache from file: [C:\Users\lpinzonm\.kettle\db.cache]
INFO 02-05 11:34:35,231 - DBCache - We read 13 cached rows from the database cache!
ERROR: No repository provided, can't load transformation.

I tried the following lines, with the same error:

C:\kettle\data-integration>Pan.bat /rep: "catalogo SISEC" /user:admin /pass:admin /trans: "firbird_to_oracle" /level:Detailed
C:\kettle\data-integration>Pan.bat /rep: catalogo SISEC /dir:/ /user:admin /pass:admin /trans: firbird_to_oracle /level:Detailed

P.S.: I do not want to run them from a xxxx.ktr file, only from the repository.

thanks for the help

LP:D
Attached Images

Problem deploying CDE Dashboard

How should I deploy a dashboard created with CDE on my local Pentaho BI Server to another (production) BI Server?

I tried sending the generated .cdfde file, but that didn't work.

Then I tried copying the solution folder, again without success.

I'm wondering what I'm missing.

Thank you,
Uilian.

Issue with CDC on Pentaho 5

We are experimenting with caching options in order to improve query performance in our Pentaho 5 EE environment. We do use Analyzer, but we also make heavy use of custom applications that rely on CDA and XMLA calls for access to data through MDX, SQL, and also using kettle endpoints. I'd like to have better visibility into cache utilization, better control over refreshing portions of the cache, and better ability to scale the cache as needed.

We looked into using Infinispan but, from what I can gather, it only provides benefits for Analyzer.

So, we did a little research and came across the Community Distributed Cache (CDC) plugin. This looks really promising and we'd like to give it a go. We tried installing it on a local 5.0.1 CE server to do some quick tests and ran into the following error when starting the server:

12:01:17,234 ERROR [Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager: PluginManager.ERROR_0015 - Plugin cdc loaded, but post-load processing failed
java.lang.NoSuchMethodError: org.pentaho.platform.engine.core.system.PentahoSystem.getUserDetailsRoleListService()Lorg/pentaho/platform/api/engine/IUserDetailsRoleListService;
at pt.webdetails.cdc.plugin.CdcConfig.getAdminSession(CdcConfig.java:176)
at pt.webdetails.cdc.plugin.CdcConfig.<init>(CdcConfig.java:53)
at pt.webdetails.cdc.plugin.CdcConfig.getConfig(CdcConfig.java:60)

I did some research, and it looks like the getUserDetailsRoleListService() method existed in PentahoSystem in version 4.8 but is gone in 5.0. It's odd, because poking around in the commits, I can see that CDC was changed to use this method in the second half of 2013.
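One quick way to confirm which side is at fault is to probe the server's classpath for the method with plain reflection. This is a diagnostic sketch; the class and method names come from the stack trace above, and the probe class itself is hypothetical:

```java
public class MethodProbe {
    // Returns true if the named class is on the classpath and exposes
    // a public method with the given name (declared or inherited).
    static boolean hasMethod(String className, String methodName) {
        try {
            for (java.lang.reflect.Method m : Class.forName(className).getMethods()) {
                if (m.getName().equals(methodName)) {
                    return true;
                }
            }
        } catch (ClassNotFoundException e) {
            // class is not on this classpath at all
        }
        return false;
    }

    public static void main(String[] args) {
        // Run against the server's kettle/platform jars; on Pentaho 5
        // this is expected to print false, matching the NoSuchMethodError.
        System.out.println(hasMethod(
                "org.pentaho.platform.engine.core.system.PentahoSystem",
                "getUserDetailsRoleListService"));
    }
}
```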

So, to summarize, two questions:
1. Is CDC known to work with Pentaho 5? If so, perhaps I made a mistake in my install?
2. If CDC has a known incompatibility with Pentaho 5, does anyone know if it's going to be updated?

I'd also love to hear any feedback folks have on CDC, or any other ways to improve MDX performance with large result sets. We're currently running Infobright CE and Pentaho EE with a large JVM heap, trying to cache as much as possible.

Thanks!

Output Step Metrics - Lines Output

Hello,

I'm using the Merge Rows (Diff) step with the Synchronize after Merge step. I also use Output Steps Metrics to store information about the records read, written, etc.

There's something I don't get about this Output Step Metrics. Say my target table is empty and my source contains 10,000 records. The Output Step Metrics step reports the following:

Source (Table Input):
LinesInput : 10,000
LinesOutput : 0
LinesRead : 0
LinesUpdated : 0
LinesWritten : 10,000
LinesRejected : 0

Target (that comes into the Merge Rows Diff):
All zero because there's no data yet

Synchronize after Merge:
LinesInput : 0
LinesOutput : 20,000
LinesRead : 10,000
LinesUpdated : 0
LinesWritten : 10,000
LinesRejected : 0


Why is LinesOutput equal to 20,000?

I'm running PDI 5.0.1 on Windows 8.1, by the way.
Thanks for the help; I would really like to understand this number...

Cognos Online Training | Online Cognos Training USA | UK | INDIA

We are proud to announce the launch of another online training institute, TEKSON IT Services. Through this project we provide many types of courses, such as Java, Data Warehouse, and many more; one of the best is Cognos Online Training Worldwide. We have international IT professional trainers who cover each and every topic related to Cognos online. Some of the main topics are:


  1. Characteristics of Data Warehousing
  2. OLTP Vs OLAP Databases
  3. Different Approaches of DWH
  4. Data mart Vs Data Warehouse
  5. ETL and Reporting tools
  6. Introduction to Cognos
  7. About Cognos 8.4
  8. Tier Architecture
  9. Cognos Releases (EP Series 7, ReportNet, Cognos 8.0/8.1/8.2/8.3)
  10. Features of Cognos 8.4
  11. Cognos Vs other OLAP tools
  12. Cognos components (Modeling & Reporting)
  13. Different cognos services
  14. Cognos Connections
  15. Personalize cognos connection
  16. Cognos configuration, content Store


There are many more subtopics; for full details please go through the website.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact Number: USA: +1 010-674-9448,
INDIA: +91 939-185-5249,

Email: teksonit@gmail.com,

Web: http://www.teksonit.com/cognos-online-training/

HBase input step filter

Hello guys,
I am testing the "HBase Input" step, but I am having some problems filtering data...

I followed the tutorial below:
http://wiki.pentaho.com/display/EAI/HBase+Input


I have a table called 'test' with one column family.


Create the table:
create 'test', 'cf'

Add some data:
put 'test', 'row1', 'cf:a', 'value1'
put 'test', 'row1', 'cf:b', 'value2'
put 'test', 'row2', 'cf:a', 'value3'

My problem is filtering data in the HBase Input step. In the "filter result set" tab I added one filter:


Alias: cf
type: String
Operator: substring
Comparison value: value2


but I still get the whole result set, without any filtering...




any help?


thanks a lot!

Core Java Online Training With real time projects Worldwide

TEKSONIT provides the best software training for various computer IT courses through WebEx and GoToMeeting. We also provide classroom courses to contend with today's competitive IT world. Learners can grasp the technology from our highly experienced and certified trainers, who help students work on real-time projects. Our team of well experienced Core Java trainers, with vast real-time IT experience in online training, is dedicated to providing quality training.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact Number: USA : +1 010 674 9448, IND : +91 93918 55249,

Email: teksonit@gmail.com

Web:http://www.teksonit.com/core-java-online-training/

Best Ab Initio Online Training by Vast Experienced Trainers

Teksonit presents Ab Initio Online Training in INDIA with real-time scenarios. It is a great opportunity for anyone who wants to learn Ab Initio online, because we have highly dedicated and experienced Ab Initio professional trainers. We offer Ab Initio Online Training not only in INDIA but also in countries worldwide, such as the USA, UK, CANADA, SINGAPORE, and many more. You can learn Ab Initio from anywhere in the world online.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact Number: India: +91 93918 55249, USA: +1 010-674-9448

Email: teksonit@gmail.com ,

Web: http://www.teksonit.com/ab-initio-online-training/

How to set the Table Output step for multiple input queries: Kettle Java API

Checking for the existence of the fact table's foreign key in a dimension

Hello everyone, I have to do the following:

During the load, records may appear in the fact table for which the corresponding information is not available in some of the dimensions. Compare the existing data in each dimension before loading each record of the fact table. If any of the data does not exist, divert the output to a text file, "excepciones.csv".

For example, I have a rates dimension that only holds information for ids 1 to 4. But in the input .csv for the fact table, there are rows with rate ids going up to 20. The same happens with other dimensions. The fact table has information that some dimensions do not. So, instead of raising an error because the reference to those keys cannot be found, the idea is to divert those rows to the excepciones.csv file.

The problem is that I can't figure out how to do that filtering.

Regards to all!!
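In PDI the usual pattern for this is a Database Lookup (or Stream Lookup) against each dimension, followed by a Filter Rows step that sends rows with a failed lookup to a Text file output. The core logic is just a partition by key membership, sketched below with hypothetical in-memory data (the class and method names are illustrative, not Kettle API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class FactRowPartition {
    // Split fact-table keys into rows that can be loaded (the key exists
    // in the dimension) and rows destined for excepciones.csv (key missing).
    static Map<String, List<Integer>> partition(Set<Integer> dimensionKeys,
                                                List<Integer> factKeys) {
        Map<String, List<Integer>> result = new LinkedHashMap<>();
        result.put("load", new ArrayList<>());
        result.put("exceptions", new ArrayList<>());
        for (int key : factKeys) {
            String target = dimensionKeys.contains(key) ? "load" : "exceptions";
            result.get(target).add(key);
        }
        return result;
    }

    public static void main(String[] args) {
        // The rates dimension only knows ids 1..4; fact rows reference up to 20.
        System.out.println(partition(new HashSet<>(Arrays.asList(1, 2, 3, 4)),
                                     Arrays.asList(1, 5, 3, 20)));
    }
}
```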

Checking for the existence of the fact table's foreign key in a dimension

Hello everyone, I have to do the following:

During the load, records may appear in the fact table for which the corresponding information is not available in some of the dimensions. Compare the existing data in each dimension before loading each record of the fact table. If any of the data does not exist, divert the output to a text file, "excepciones.csv".

For example, I have a rates dimension that only holds information for ids 1 to 4. But in the input .csv for the fact table, there are rows with rate ids going up to 20. The same happens with other dimensions. The fact table has information that some dimensions do not. So, instead of raising an error because the reference to those keys cannot be found, the idea is to divert those rows to the excepciones.csv file.

The problem is that I can't figure out how to do that filtering. I need help.


Possible bug getting integer OUT MySQL parameter in stored procedure step

Hello all!

I'm having big trouble retrieving an OUT parameter from a MySQL stored procedure into the stream. I think it may be a bug, because it only occurs with an Integer out parameter; it works with a String out parameter. The exception I get is:
Code:

Invalid value for getLong() - '
I think the parameters are correctly set, as you can see in the .ktr.

You can replicate the bug in this way:

Schema

Code:

create schema if not exists test;
use test;
DROP PROCEDURE IF EXISTS procedure_test;
delimiter $$
CREATE PROCEDURE procedure_test(IN in_param INT UNSIGNED, OUT out_param INT UNSIGNED)
BEGIN
    SET out_param := in_param;
END
$$

KTR
I attach the .ktr file with the steps. You just have to set up the connection to the MySQL test schema to try it.
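As a cross-check outside Kettle, the procedure can be called directly over JDBC; registering the OUT parameter explicitly as Types.INTEGER is what should make an integer retrieval work. This is a hedged sketch: the JDBC URL and credentials are placeholders for your environment, and it requires a live MySQL with the schema above.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Types;

public class ProcOutParamCheck {
    // Builds the JDBC escape syntax for a stored-procedure call with the
    // given number of parameters, e.g. "{call procedure_test(?, ?)}".
    static String callSyntax(String procedure, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procedure).append('(');
        for (int i = 0; i < paramCount; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        return sb.append(")}").toString();
    }

    // Calls procedure_test and returns the integer OUT parameter.
    static int callProcedureTest(Connection con, int inValue) throws SQLException {
        try (CallableStatement cs = con.prepareCall(callSyntax("procedure_test", 2))) {
            cs.setInt(1, inValue);
            cs.registerOutParameter(2, Types.INTEGER); // force integer retrieval
            cs.execute();
            return cs.getInt(2);
        }
    }

    public static void main(String[] args) throws SQLException {
        // Placeholder URL and credentials; point this at your MySQL test schema.
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/test", "user", "password")) {
            System.out.println(callProcedureTest(con, 42));
        }
    }
}
```

If this plain JDBC call returns the integer cleanly, that would point at the Kettle stored-procedure step rather than the driver.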

Other data
- MySQL connector: 5.1.30
- MySQL version: 5.5
- Kettle version: 5.0.1-stable
- OS: Ubuntu 12.04

Any help from the community will be highly appreciated. Thanks in advance!
Attached Files

[Tutorial] The Making of a Pretty Web Stats Report with Pentaho Report Designer

[How to define a table name and file path from variables]

Hi,

I am still a beginner and need a lot of help.

What I want to do: I create the variables pathFile and tableName, set them, and assign their values from Get System Info [command line argument 1]. Then I make one job [InsertFileToDB] where I drop in Get Variables, Text File Input, and Table Output steps. I first tried the links getVariable1 -> textFileInput -> tableOutput and getVariable2 -> tableOutput, which gave me an error. I then changed the links to getVariable -> textFileInput -> tableOutput. At first I couldn't pass the tableName, so I checked "pass through fields from previous step" and it worked, but now all the variables from Get Variables [pathFile, tableName] are passed through, and I don't need the pathFile variable in Table Output...

Can someone help me?

Thanks and regards,
Hartz

Excel template input

Hi,

I have this Excel template and I need to extract the data from it into a database. Do I need to specify the cell numbers, or is there another way to do it?

Thanks.
template.xls
Attached Files

Informatica Training Online by highly experienced professional trainers

Informatica Online Training classes by SAP Centre, with highly qualified and experienced trainers. We started these Informatica classes with a passion for providing the best quality learning to Informatica learners. We also provide a complete set of materials for our students, so that you can go through them and understand the subject properly.
SAP Centre advantages for our learners:

  1. Job oriented training
  2. Attend classes at your convenient time
  3. Recording facility for each and every class
  4. Excellent high quality web-conferencing technology
  5. Classes by real time 10+ years exp trainer
  6. Course covers all the Real time scenarios
  7. 100 % Placement assistance in USA, CANADA
  8. Resume Preparation, Projects explanation
  9. Latest version up-gradation and updates at free of cost
  10. 24×7 and 365 days 100% up-time dedicated server access
  11. Provide soft copy of the course material
  12. Interview preparation and guidance and clearance support will be provided
  13. Project and technical support given after completion of the course

There are many more advantages; for full details please visit our website.
Please call us for the demo classes; we have regular batches and weekend batches.

Contact Number: India: +91 8686984442,

Email: saponlinecentre@gmail.com ,

Web: http://www.sapboonline.com/informatica-online-training/