Channel: Pentaho Community Forums
Viewing all 16689 articles

Unable to get VFS File object for filename

Hi everybody,
I am running Kitchen on Linux, but sometimes an error occurs. The message is below:

Code:

Unable to get VFS File object for filename  '/home/hadaemon/.kettle/plugins' : Could not find file with URI  "/home/hadaemon/.kettle/plugins" because it is a relative path, and no base URI was provided.
This error does not occur every time. Looking at the Kettle source, I found that the class KettleVFS handles the relative path and returns a full path using VFS:
Code:

       
fsm = new StandardFileSystemManager();
try {
    fsm.setFilesCache(new WeakRefFilesCache());
    fsm.init();
} catch (FileSystemException e) {
    e.printStackTrace();
}

// Install a shutdown hook to make sure that the file system manager is closed.
// This will clean up temporary files in vfs_cache.
Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
    @Override
    public void run() {
        if (fsm != null) {
            System.out.println("destroy");
            fsm.close();
        }
    }
}));

Code:

FileSystemManager fsManager = getInstance().getFileSystemManager();
...

if (fsOptions != null) {
    fileObject = fsManager.resolveFile(filename, fsOptions);
} else {
    fileObject = fsManager.resolveFile(filename);
}

return fileObject;

From testing I know the error occurs when the fsManager has been closed, and from the code the fsManager should only be closed at JVM exit. Why is the fsManager closed when I init Kettle? Can anybody help me?
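For reference, one defensive pattern (a sketch in plain Java, not Kettle's actual fix) is to check whether the shared manager has been closed and lazily re-create it before resolving, instead of failing. The `Manager` class below is a minimal stand-in for Apache VFS's `StandardFileSystemManager`; only the closed/re-create logic is the point:

```java
import java.util.concurrent.atomic.AtomicReference;

class ManagedVfs {
    /** Minimal stand-in for a closeable VFS file-system manager. */
    static class Manager {
        private volatile boolean closed = false;
        void close() { closed = true; }
        boolean isClosed() { return closed; }
        String resolveFile(String name) {
            if (closed) throw new IllegalStateException("Unable to get VFS File object: manager closed");
            return "file://" + name;
        }
    }

    private static final AtomicReference<Manager> CURRENT = new AtomicReference<>(new Manager());

    /** Return a usable manager, replacing it if a shutdown hook (or another init) closed it. */
    static Manager get() {
        Manager m = CURRENT.get();
        if (m.isClosed()) {
            CURRENT.compareAndSet(m, new Manager());
            m = CURRENT.get();
        }
        return m;
    }
}
```

This does not explain *why* the hook fires early, but it makes the resolve path robust against any code in the same JVM closing the shared manager.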

Loop over array of transformation names and execute them concurrently

I have a large number of similarly named transformations that I currently run concurrently. The problem is that it's becoming rather unwieldy:

http://i1173.photobucket.com/albums/...ps312e5418.png

My thought was to create an array of these transformation names, dynamically load each transformation, and run them concurrently as before. Is this sort of thing possible in Data Integration? If so, how would I go about doing it?
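Within PDI you would typically feed the name list into a looping job or a Transformation Executor, but the underlying pattern — loop over an array of names and launch each one concurrently — can be sketched in plain Java. Here `runTransformation` is a placeholder for the real Kettle call (loading the .ktr and executing it via the Java API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

class ConcurrentTransRunner {
    // Placeholder: with the Kettle API this body would load the .ktr and run it.
    static String runTransformation(String name) {
        return "finished:" + name;
    }

    /** Submit every named transformation to a pool and collect results in order. */
    static List<String> runAll(List<String> names, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String name : names) {
                futures.add(pool.submit(() -> runTransformation(name)));
            }
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                results.add(f.get()); // surfaces any per-transformation failure
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```

The fixed-size pool bounds how many transformations run at once, which is usually what you want once the list grows large.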

Table component dataBar not working in Pentaho 5.2

Hi,
I made a dashboard with a table component.
In the table component I have a column of type dataBar,
but the column stays empty.

When I import the same dashboard into Pentaho 5.1, the dataBar shows.

In both Pentaho installations I am using CDF, CDA, CDE and CGG 14.10.15.
In the Pentaho 5.2 installation I also tried 5.2.0.0-209.

Does anybody have the same issue?

Sven

Error while trying to insert rows into POSTGRESQL

Hello guys,

I have a DW scenario here and I want to populate my DW through stage using Kettle.

As the official scenario is quite complex and the error also happens in simple tests, I am only posting the simple scenario here.

What I want to do:

Import rows from a TABLE INPUT (Firebird) into a TABLE OUTPUT (PostgreSQL).

I mapped all the fields and ran the transformation, which failed with the error appended to the end of this thread. Then I realized that I had some columns in upper case, which could cause problems, so I recreated my table with all columns in lower case, but I still get the same error.

* When I copy the INSERT statement from the error message and run it directly in PostgreSQL (same user account), it succeeds.

Please assist.


2014/11/19 17:06:34 - Table output.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected batch update error committing the database connection.
2014/11/19 17:06:34 - Table output.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseBatchException:
2014/11/19 17:06:34 - Table output.0 - Error updating batch
2014/11/19 17:06:34 - Table output.0 - Entrada em lote 0 INSERT INTO fin."fatofatdre" ("empresa", "ano", "mes", "valor_nf", "nivel1", "nivel2", "nivel3") VALUES ( 1, '2013', '06', '25934.85', 'GOVERNO', 'DESENVOLVIMENTO', 'PROJETO') foi abortada. Chame getNextException para ver a causa.
2014/11/19 17:06:34 - Table output.0 -
2014/11/19 17:06:34 - Table output.0 - at org.pentaho.di.core.database.Database.createKettleDatabaseBatchException(Database.java:1377)
2014/11/19 17:06:34 - Table output.0 - at org.pentaho.di.core.database.Database.emptyAndCommit(Database.java:1366)
2014/11/19 17:06:34 - Table output.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.dispose(TableOutput.java:571)
2014/11/19 17:06:34 - Table output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:96)
2014/11/19 17:06:34 - Table output.0 - at java.lang.Thread.run(Thread.java:745)
2014/11/19 17:06:34 - Table output.0 - Caused by: java.sql.BatchUpdateException: Entrada em lote 0 INSERT INTO fin."fatofatdre" ("empresa", "ano", "mes", "valor_nf", "nivel1", "nivel2", "nivel3") VALUES ( 1, '2013', '06', '25934.85', 'GOVERNO', 'DESENVOLVIMENTO', 'PROJETO') foi abortada. Chame getNextException para ver a causa.
2014/11/19 17:06:34 - Table output.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement$BatchResultHandler.handleError(AbstractJdbc2Statement.java:2743)
2014/11/19 17:06:34 - Table output.0 - at org.postgresql.core.v3.QueryExecutorImpl$1.handleError(QueryExecutorImpl.java:461)
2014/11/19 17:06:34 - Table output.0 - at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1928)
2014/11/19 17:06:34 - Table output.0 - at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:405)
2014/11/19 17:06:34 - Table output.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.executeBatch(AbstractJdbc2Statement.java:2892)
2014/11/19 17:06:34 - Table output.0 - at org.pentaho.di.core.database.Database.emptyAndCommit(Database.java:1353)
2014/11/19 17:06:34 - Table output.0 - ... 3 more
2014/11/19 17:06:34 - Table output.0 - Finished processing (I=0, O=0, R=234, W=0, U=0, E=1)
2014/11/19 17:06:34 - Transformação 1 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Erros detectados!
2014/11/19 17:06:34 - Spoon - The transformation has finished!!
2014/11/19 17:06:34 - Transformação 1 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Erros detectados!
2014/11/19 17:06:34 - Transformação 1 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Erros detectados!
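The log's own hint, "Chame getNextException para ver a causa" ("call getNextException to see the cause"), is the key: the PostgreSQL JDBC driver hides the real server-side error behind the BatchUpdateException. A small sketch of walking the chained SQLExceptions to surface the root cause (the class and method names here are illustrative, not Kettle code):

```java
import java.sql.SQLException;

class BatchErrorUnwrapper {
    /**
     * JDBC chains follow-up errors via SQLException.getNextException().
     * Walk to the last one; for PostgreSQL batch failures it carries
     * the actual server-side error message (e.g. a type mismatch).
     */
    static String rootCauseMessage(SQLException e) {
        SQLException cur = e;
        while (cur.getNextException() != null) {
            cur = cur.getNextException();
        }
        return cur.getMessage();
    }
}
```

Given the quoted INSERT, a likely suspect is '25934.85' being sent as a string into a numeric column, but only the unwrapped message will say for sure.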

Creating conditions in Analytic Query

Hi. Can I create conditions in an Analytic Query? In the Analytic Functions table I can see a column named N, and when I use it I get a list of functions. Can someone let me know how to use those functions? Please attach any material on using them.

simulate a variable

So I have a stream of data whose content is variable but cyclic. Imagine information about patients: patient A is identified on one line and the next 20 lines all pertain to patient A; then comes a new patient, patient B, followed by the same sequence, etc.

so if the input looked like:
Patient Name: Leroy
Age 30
Sex no so much
Height short
Patient Name Mary
.
.
you get the idea
and my output needs to be:
Leroy 30 Not so Much Short
Mary 25 whenever tall

To do this I need to read in Leroy and hold it until I see a new patient.
I don't see how I can simulate setting a variable that can only change when I see a trigger, i.e. "Patient Name" in this case.

Can this be done within the confines of Data Integration?
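In PDI this is usually solved with a scripting step that keeps state across rows (e.g. a Modified Java Script Value holding the current patient) followed by a denormalising step. The state-on-trigger logic itself, sketched in plain Java (field parsing here is a simplifying assumption: the value is whatever follows the first word on each line):

```java
import java.util.ArrayList;
import java.util.List;

class PatientPivot {
    static final String TRIGGER = "Patient Name";

    /** Collapse the cyclic input into one "name field1 field2 ..." row per patient. */
    static List<String> pivot(List<String> lines) {
        List<String> rows = new ArrayList<>();
        StringBuilder row = null;            // the "variable" that only changes on a trigger
        for (String line : lines) {
            if (line.startsWith(TRIGGER)) {  // trigger: flush the previous patient, start a new one
                if (row != null) rows.add(row.toString());
                String name = line.substring(TRIGGER.length()).replaceFirst("^[:\\s]+", "");
                row = new StringBuilder(name);
            } else if (row != null) {
                // keep everything after the field label (the first word)
                String[] parts = line.trim().split("\\s+", 2);
                if (parts.length == 2) row.append(' ').append(parts[1]);
            }
        }
        if (row != null) rows.add(row.toString());
        return rows;
    }
}
```

The essential trick is that the accumulator is only reset when the trigger line appears; every other line just appends to it.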

Report Web Viewer Issue

Greetings everyone
I currently have no idea why the Report Web Viewer doesn't show the sample reports (steelwheels). I know that's not much info, but I'm that lost and there is no information about it. I've tried publishing other reports through Report Designer without a problem, but when I try to view them on the server it just shows an empty page; it's not loading or doing anything.
I remember that in testing everything was fine, but I'm not sure what happened.
Has anyone had this kind of issue? Could a plugin or some configuration cause this?

I'm currently running CE server 5.0.1.

getting around bug with TableOutput and PostgreSQL return auto-generated key

I just thought I would share a hack that gets around the bug http://jira.pentaho.com/browse/PDI-8069, where the returned key field must be the first field in the table you are inserting into.

If your table is defined like this:

Code:

CREATE TABLE my_table (
  some_field varchar(40) null,
  some_int integer null,
  my_table_pk SERIAL NOT NULL PRIMARY KEY, -- notice primary key not first field
  other_field1 varchar(30) null
);

You can insert into this table and return the auto-generated serial value by creating a view on top of the table and then inserting into the view.

Code:

CREATE VIEW v_my_table as
SELECT
    my_table_pk,
    some_field,
    some_int,
    other_field1
FROM my_table;

Just re-order the fields so that the serial field is the first one in the view, then point the Table Output step at this new view. It works using PostgreSQL 9.3; it may work on older 9.x branches, but I have not tested that.

Please help me connect Pentaho to SQL Server 2012 Express

Session

How do I get a session cookie value using PDI? Is this possible using PDI components? If yes, how can I do it? I tried hitting the below URL in my browser.

Request url :

https://www.linkedin.com/uas/oauth2/....httpsnow.org/

When we hit this in the browser, we are redirected to the grant-access page; after submitting the URL and password we can get the jsessionid cookie value. But how can I automate this process using the PDI ETL tool?

Can anyone help me handle the above steps, i.e. passing request_url and grant_access?
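PDI's HTTP Client and REST Client steps can capture response headers, but the OAuth grant page itself normally needs a scripted login. Once you do have the `Set-Cookie` header values (e.g. from `HttpURLConnection.getHeaderFields().get("Set-Cookie")`), extracting the JSESSIONID is plain string parsing; a minimal sketch:

```java
import java.util.List;

class CookieExtractor {
    /**
     * Pull a named cookie (e.g. JSESSIONID) out of a list of Set-Cookie
     * header values. Each header looks like "NAME=value; Path=/; HttpOnly".
     */
    static String cookieValue(List<String> setCookieHeaders, String name) {
        for (String header : setCookieHeaders) {
            for (String part : header.split(";")) {
                String[] kv = part.trim().split("=", 2);
                if (kv.length == 2 && kv[0].equalsIgnoreCase(name)) {
                    return kv[1];
                }
            }
        }
        return null; // cookie not present
    }
}
```

In a PDI flow the equivalent parsing could live in a Modified Java Script Value step applied to the captured header field.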
[Attached image: Pentaho token base authentication]

PUC reports show old data even when data is updated?

Hi everyone

I have installed biserver5.1 CE and can view published reports.

However, when I update my data source (on the same server), I see the old (possibly cached) data.

Here's what's happening:

MongoDB is the primary database.
A script populates a mySQL database from that MongoDB database (on the same server) in order to create tables etc to report off.
(I'm not great with the Aggregation Framework etc in MongoDB :))
A cron job runs that populate script.
When the reports are viewed in the PUC, I see old data.

But when I republish the report (designed on my laptop using localhost as a FQDN and published to the biserver ), I see the latest data.

What is wrong?
I would have thought that the report would just soak up all the current data in whatever tables etc it finds.
What do I need to do?

Thanks very much
Brad

Generate date of past sundays using Calculator Step

I need 15 rows as a result, each being the date of a Sunday counting back from today.

Rather than doing it in a query, can I generate them using the Generate Rows and Calculator steps?
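Yes — one PDI approach is Generate Rows (15 rows) plus an Add sequence step and a Calculator date-subtraction operation. The date arithmetic itself, sketched with java.time (the step names are from PDI's standard palette; the code is illustrative, not a PDI artifact):

```java
import java.time.DayOfWeek;
import java.time.LocalDate;
import java.time.temporal.TemporalAdjusters;
import java.util.ArrayList;
import java.util.List;

class PastSundays {
    /** The most recent `count` Sundays on or before `from`, newest first. */
    static List<LocalDate> lastSundays(LocalDate from, int count) {
        List<LocalDate> sundays = new ArrayList<>();
        // Snap back to the nearest Sunday (or stay, if `from` is already one)...
        LocalDate d = from.with(TemporalAdjusters.previousOrSame(DayOfWeek.SUNDAY));
        // ...then step back one week per row.
        for (int i = 0; i < count; i++) {
            sundays.add(d.minusWeeks(i));
        }
        return sundays;
    }
}
```

In PDI terms: the sequence value plays the role of `i`, and the Calculator subtracts `7 * i` days from the snapped-back Sunday.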

Setting a Column Value to a Variable..

Hello there,

I'm new to PDI. I would like to know how to set a variable with a value from a table in a database. I'm trying to use an Execute SQL Script step and link it to a Set Variables task, but I can't get it to work. Can anyone help?

Auto Complete Box not working with parameter


The Auto Complete Box is not working when I add a parameter and set a listener to it (if I remove the parameter it works fine).
Pentaho CDE 14.07.29, stable version.

Is this a bug?

Error : SolutionEngine.ERROR_0007 - Action sequence execution failed

Hi,

We are using Pentaho 5.0.6 with a MySQL database and getting the error below. Can anybody help us with this?

f315aa19-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:32:20,160 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] f37aad69-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:32:20,627 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] f3c216ac-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:32:26,237 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] f719f340-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:32:26,654 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] f759e273-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:32:27,050 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] f7962826-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:32:27,439 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] f7d13559-65b0-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:33:11,052 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] 11d0544d-65b1-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:33:11,442 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] 120bd6b0-65b1-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed
2014-11-19 04:33:11,832 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] 12475913-65b1-11e4-9c23-90e2ba5cbc80:SOLUTION-ENGINE:/public/bi-developers/rules/session-region-list.xaction: SolutionEngine.ERROR_0007 - Action sequence execution failed

Thanks,
Chandan

Free webinar on 'How to do Pentaho Version Migration in Just 1-4 Weeks'

Why Migrate to Pentaho 5.x?

Why Migrate?
· Challenges with current / previous versions
· Benefits of new version
Who should Migrate?
· Educate Benefits vs. Needs
How should we Migrate?
· Migration options
· Pentaho migration tool and associated limitations
· GrayMatter migration methodology and related advantages

Accelerate your Pentaho migration with GrayMatter in just 1–4 weeks. Register for the free webinar on Pentaho migration here: http://bit.ly/11o3mCr

Know more at www.pentaho.graymatter.co.in

Saiku analytics

Hi,
I am new to Pentaho technology.
Is there any way to open Saiku Analytics (downloaded from the Marketplace in the Pentaho User Console) in a new window/tab, so it opens independently?

auto increment in pentaho report designer

Hi,
I am new to Pentaho technologies.
I want to add a serial number in an auto-incrementing manner in Pentaho Report Designer.
Please explain the steps.
Thanks in advance.

Is Possible to Use Offline Maps in Pentaho Analysis Report?

Hi friends,


How can I use offline maps in the Pentaho Analysis Report?
What do I need to configure?


Has someone already been through a situation like this, or is there any material you can point me to?


Grateful!

