Channel: Pentaho Community Forums

CDA & scheduled query refresh

Hello.

As my topic says, I am wondering how I can schedule a query to be run or refreshed, say, every 30 seconds. I know this may be a silly question, but this is all new to me, so bear with me :-)
Do I have to add some code to every query that I want refreshed every 30 seconds, or is there some global CDA timing table that can refresh all my queries at the same time?

If you have a hunch or a solution regarding this, please reply! Any help is appreciated at the moment.
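One thing worth checking (offered as a pointer, not a confirmed answer): CDF/CDE dashboard components accept a refreshPeriod property, in seconds, which re-runs the component's backing CDA query on a timer, so the setting is per component rather than per query. A minimal sketch, where the component name, type, and data source name are all made up for illustration:

```javascript
// Hypothetical CDF component definition; only refreshPeriod matters here.
// refreshPeriod is in seconds: the component re-fires its query on this timer.
var salesChart = {
  name: "salesChart",
  type: "cccBarChart",
  refreshPeriod: 30,          // re-query every 30 seconds
  dataSource: "salesQuery",   // hypothetical CDA data-access name
  htmlObject: "chartDiv"
};
```

CDA itself also has a cache with a scheduler for pre-warming query results, which is a separate mechanism from the per-component refresh above.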

Porting CDE dashboards from 4.8 to Pentaho 5.1

Hi all,
I have not found clear information on how to transfer CDE objects from 4.8 to 5.1.
I think a how-to guide would be of great help for the many users who have chosen C-Tools for their dashboards.
Thank you

Confusion over database connection scope - local Spoon versus Data Integration server

Hi all,

We have installed Pentaho Data Integration on a Linux server running on an Amazon EC2 instance and I'm using Spoon running on my Windows PC.

On starting Spoon, I login to the repository which is running on the EC2 server using the corresponding IP address for the URL i.e. I know I am connecting to the EC2 repository and not a local copy.

I am having trouble with the database connections as they seem to be emanating from the local Spoon client rather than the Data Integration server running on EC2.

For example, if I connect to the di_jackrabbit database using 'localhost', it is connecting to a local version of the database on my Windows PC rather than the version running on the EC2 server.

The key issue is that the database I need to connect to is configured for access (firewall and JDBC access permissions) from the EC2 server, not from my Windows PC, so I am getting 'Connection timed out' errors when I try to connect.

Thanks!

Michael Stone

Mondrian Error: Hierarchy appears in more than one independent axis

Hello everyone

I'm trying to get the following query working:

Code:

WITH
MEMBER Measures.ECns AS ([Measures].[Amount], [Indicator].[Energy Consumption])
MEMBER Measures.ECost AS ([Measures].[Cost], [Indicator].[Energy Consumption])

SELECT
NON EMPTY {[Measures].[EnergyCons], [Measures].[EnergyCost]} ON COLUMNS,
NON EMPTY {[Time.Semester_Quarter_Time].[All Time.Semester_Quarter_Times].Children} ON ROWS
FROM [Cube]
WHERE CROSSJOIN([Geo].[Americas].[North America], [Center].[root].Children) * {[Time.Semester_Quarter_Time].[Year].[2011]:[Time.Semester_Quarter_Time].[Year].[2013]}

But I'm getting the following error:
Quote:

MondrianException: Mondrian Error:Hierarchy '[Time.Semester_Quarter_Time]' appears in more than one independent axis.
I think the problem is that I'm using different levels on the ROWS and WHERE axes ([All Time.Semester_Quarter_Times] vs. [Year]), but I cannot make it work. Am I doing something wrong, or does it have anything to do with this http://forums.pentaho.com/showthread...different-axes ?
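Mondrian 3.x does reject queries where the same hierarchy appears on two independent axes, and the slicer (WHERE) counts as an axis for this check. One common workaround, assuming a year-level breakdown on rows is acceptable, is to move the Time range out of the slicer and onto ROWS so the hierarchy appears only once. A sketch based on the query above:

```
WITH
MEMBER Measures.ECns AS ([Measures].[Amount], [Indicator].[Energy Consumption])
MEMBER Measures.ECost AS ([Measures].[Cost], [Indicator].[Energy Consumption])
SELECT
NON EMPTY {[Measures].[ECns], [Measures].[ECost]} ON COLUMNS,
NON EMPTY {[Time.Semester_Quarter_Time].[Year].[2011] : [Time.Semester_Quarter_Time].[Year].[2013]} ON ROWS
FROM [Cube]
WHERE CROSSJOIN([Geo].[Americas].[North America], [Center].[root].Children)
```

Note this changes the row set from the All-level children to the year range, so the result shape differs from the original intent.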

Is this an isolated limitation of Mondrian 3.x? The following query works flawlessly on SSAS:

Code:

SELECT [Measures].[Recuento] on 0,
Hierarchize({[Tiempo].[Ano].Members}) on 1
FROM [DWH Pruebas]
WHERE [Tiempo].[Calendario].[Ano].&[2013]


Thanks in advance!

Administration: Users and Roles in 5.0 CE

Hi Forum,

I have put my CDE project folder under the Home folder of the "Folders" panel. Now what I would like to do is give access to this folder (i.e., access to the dashboards) to specific users only.

To do this, I have done the following.

Created two users, let's say user1 and user2.
Created two roles, let's say role1 and role2.

role1 Operation permission is : Read content
role2 Operation permission is : Create content

I have shared the CDE project folder with user1 in its properties (though it asks for roles and users, I have given user1, assuming user1's associated role is role1).

Now I have logged out of the server (where I was logged in as Admin) and logged in as user1/user1, then tried to access the CDE project. Logged in as user1, the server should show only the user1 folder under Home, but it is showing all the other users (Admin, suzy, tiffany, pat, user1, user2). Also, user1's operation permission is Read Content, yet I can still see the Create New option when I move to Home, and File->New still allows new creation, which should not happen with Read Content only.

In the other case, I have logged in as user2, which is the one with Create Content.

How can I make my CDE dashboards viewable for user1 while allowing user2 to create content?

Thank you in advance

Is there any property to make the CCC charts render in 3D mode?

Hi Forum,

Is there any property I can set to make existing CCC pie/bar charts render in 3D mode?

I am looking for a 3D CCC property.

Setting data source programmatically for integration testing

I'm trying to set up automated (Java) integration tests for a suite of jobs and transformations. I would like to do this using an in-memory HSQLDB instance, and would like the configuration to be flexible so I can swap it out and run Kettle components from a servlet against a different database. After googling and experimenting I'm getting close, but I haven't figured out how to get my Kettle runner to use the provided data source.

The data source is configured as a Spring bean:

Code:

<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
  <property name="driverClassName" value="org.hsqldb.jdbcDriver"/>
  <property name="url" value="jdbc:hsqldb:mem:weeklyreport"/>
  <property name="username" value="sa"/>
  <property name="password" value=""/>
</bean>

This data source is injected into a Spring bean that I use to run Kettle transformations and jobs:

Code:

public void setDataSource(final DataSource dataSource) {
  this.dataSource = dataSource;
  DataSourceProviderFactory.setDataSourceProviderInterface(new DataSourceProviderInterface() {
    @Override
    public DataSource getNamedDataSource(String name) throws DataSourceNamingException {
      // Return the injected data source regardless of the requested name
      return dataSource;
    }
  });
}

From the example at http://devno.blogspot.de/2013/03/pen...sion-with.html I thought that would be enough, but when I run the transformations from my test I see that they still use the PostgreSQL database they're configured for. I went through the API docs, but couldn't find any way of making a transformation or job use this data source.

Thanks,

John

How to join input tables (compound condition)

Hi!
I have 2 table inputs, each with columns ID and Count.
I want to join them:

SELECT *
FROM Table1 T1
JOIN Table2 T2
ON T2.Count IS NULL or T1.Count > T2.Count


How can I do this in Kettle?
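One common Kettle pattern for a non-equi join like this (stated as the usual approach, not the only one) is a Join Rows (cartesian product) step, which can carry a condition on the joined rows, or be followed by a Filter Rows step holding the predicate. The semantics of that cartesian-plus-condition pattern, sketched in plain code with invented rows:

```javascript
// Sketch of Join Rows (cartesian) + condition semantics; row data is invented.
const table1 = [{ id: 1, count: 5 }, { id: 2, count: 1 }];
const table2 = [{ id: 10, count: null }, { id: 11, count: 3 }];

// Cartesian product, keeping pairs where T2.Count IS NULL or T1.Count > T2.Count
const joined = table1.flatMap(r1 =>
  table2
    .filter(r2 => r2.count === null || r1.count > r2.count)
    .map(r2 => ({ t1: r1, t2: r2 }))
);
```

If both inputs come from the same database, it may be simpler to push the whole join into a single Table Input step with the original SQL.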

Thanks.

Getting an error while using a SQL Server SP as a data source in Pentaho admin

Experts,

I want to use a SQL Server stored procedure (SP) when creating a data source, but when I try to execute it (exec <procedure name>) it throws an error.
A plain select statement (select * from schema.tablename) executes successfully.

ERROR in popup

DataSourceServiceImpl.ERROR_009
Query Validation failed.Query Validation failed{0}.


Please let me know if I am missing anything.

Thanks

Export dashboard having multiple charts in a single file (CDE)

Hi,

We have various components for export, each with limited functionality for my requirement:
  • ExportPopupComponent - can take only one chart component name; exports the chart as a PNG image only and the data as Excel.
  • Export Button - can take only one chart component (much like the above).

I have created a dashboard containing 4 different bar charts in CDE.
Is there any way to export the entire dashboard as a single PDF/image file?

Thanks

GC overhead limit exceeded in Pentaho Reporting

java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.mysql.jdbc.MysqlIO.nextRowFast(MysqlIO.java:2145)
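This OutOfMemoryError means the JVM spent nearly all its time in garbage collection, and with com.mysql.jdbc.MysqlIO.nextRowFast on the stack the likely culprit is the MySQL driver buffering the whole result set in heap. A first mitigation worth trying (the launcher script name and the environment variable used here are assumptions that vary by version and install) is simply giving the reporting JVM more heap:

```shell
# Hypothetical: raise the JVM heap before launching Pentaho Report Designer.
# JAVA_TOOL_OPTIONS is picked up by any JVM; the script name varies by install.
export JAVA_TOOL_OPTIONS="-Xmx2048m"
sh report-designer.sh
```

If raising the heap only delays the error, reducing the rows the report query pulls is the more durable fix.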

Export button to Excel, CSV

I'm generating static HTML reports through the PDI integration tool, developing the reports in Pentaho Report Designer (3.9.4). My reports have several rows of data which, when viewed on the BI server, can be exported to CSV or Excel through the portal. I'm looking for that same functionality built into my .prpt reports using Report Designer. My reports will be viewed outside of the BI server, and the user must still have the ability to export the results. I have been unsuccessful in finding any resources for adding this functionality to the report (.prpt), and using C-Tools is not an option. Does anyone have a solution or a resource that can help?

Repository Export and Import

I am using Kettle PDI 5.0.1-stable. Currently we are using a database repository on PostgreSQL. I would like to export the repository from the Postgres DB repository and load it into a file-system repository. I have tried using the "Repository Export.ktr" and "Repository Import.ktr" samples to accomplish this. The export appears to work as expected, but when I try to import the data using the import KTR I get the messages below; it appears that the process locates the first folder/transformation and then errors. I looked at the distribute field in the XML file and it is "<distribute>Y</distribute>".


Excerpt from log file.
>>
2014/07/07 11:32:16 - RepositoriesMeta - Reading repositories XML file: /home/cmuser/.kettle/repositories.xml
2014/07/07 11:32:19 - Repository import - Importing repository objects from an XML file
2014/07/07 11:32:20 - Repository import - Import objects from XML file [/opt/cm/data/archive/dev_repository_export_Jul072014.xml]
2014/07/07 11:32:20 - Repository import - Importing objects from file "/opt/cm/data/archive/dev_repository_export_Jul072014.xml"
2014/07/07 11:32:20 - Repository import - Asking in which directory to put the objects ...
2014/07/07 11:32:20 - Repository import - Importing transformation 1 : clik
2014/07/07 11:32:20 - Repository import - Locking repository database, trying to get exclusive access...
2014/07/07 11:32:20 - Repository import - Saving file /tmp/clone_501/clik
2014/07/07 11:32:20 - Repository import - Handling old version of transformation (if any)...
2014/07/07 11:32:21 - Repository import - Could not save transformation #1 as "clik": org.pentaho.di.core.exception.KettleException:
2014/07/07 11:32:21 - Repository import - Unable to save step info to the repository for id_transformation=475
2014/07/07 11:32:21 - Repository import -
2014/07/07 11:32:21 - Repository import - Error inserting/updating row
2014/07/07 11:32:21 - Repository import - ERROR: column "distribute" is of type boolean but expression is of type character varying
<<

Thank you,
James

Manual install of CDF, CDA, CDE

Hello all,

Can anyone provide working links to install CDF, CDA and CDE manually? I can't seem to get Marketplace to work correctly, so I have to do a manual install. On that note, is anyone else seeing a blank page when connecting to Marketplace?

Also, any suggestion as to why I would get this error (see attachment) when launching the CDE dashboard?
This link doesn't seem to be working correctly - http://pedroalves-bi.blogspot.com/20...lable-cdf.html


Many thanks for your time.

BI - 5.0
Windows
PostgreSQL

Pentaho 5.1-STABLE artifacts?

My team started upgrading our environment to 5.1 as soon as it was announced 2 weeks ago, but we are developing an application with Pentaho embedded in it and have been waiting for the 5.1 artifacts to be posted to Pentaho's Artifactory server. As of this post, only 5.1-SNAPSHOT and 5.1.preview.506 are posted. When will the 5.1-STABLE artifacts be posted? We are ready to upgrade our app, but prefer to let Maven handle dependencies.

Errors when trying to do PGP decrypt

I am getting an error when trying to decrypt a file using the "Decrypt files with PGP" step. Decryption works fine when run from a command prompt, but not through this step. The error is posted below. Is this the same error that is addressed in http://jira.pentaho.com/browse/PDI-7885 ?


2014/07/07 12:51:23 - Spoon - Starting job...
2014/07/07 12:51:24 - KJ_UBI_TW decrypt - Start of job execution
2014/07/07 12:51:24 - KJ_UBI_TW decrypt - Starting entry [Decrypt files with PGP]
2014/07/07 12:51:24 - Decrypt files with PGP - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : There was an error moving file [file:///C:/TWTesting/NYCM_TW_UB_Vehicles.csv.asc] to [file:///C:/TWTesting/Decrypted/NYCM_TW_UB_Vehicles.csv.asc] : [
2014/07/07 12:51:24 - Decrypt files with PGP - org.pentaho.di.core.exception.KettleException:
2014/07/07 12:51:24 - Decrypt files with PGP - IO exception while writing running command!
2014/07/07 12:51:24 - Decrypt files with PGP - Cannot run program "C:\Program": CreateProcess error=2, The system cannot find the file specified
2014/07/07 12:51:24 - Decrypt files with PGP -
2014/07/07 12:51:24 - Decrypt files with PGP -
2014/07/07 12:51:24 - Decrypt files with PGP - IO exception while writing running command!
2014/07/07 12:51:24 - Decrypt files with PGP - Cannot run program "C:\Program": CreateProcess error=2, The system cannot find the file specified
2014/07/07 12:51:24 - Decrypt files with PGP -
2014/07/07 12:51:24 - Decrypt files with PGP - ]
2014/07/07 12:51:24 - Decrypt files with PGP - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Success condition was broken. We have 1 error(s)
2014/07/07 12:51:24 - KJ_UBI_TW decrypt - Finished job entry [Decrypt files with PGP] (result=[false])
2014/07/07 12:51:24 - KJ_UBI_TW decrypt - Job execution finished
2014/07/07 12:51:24 - Spoon - Job has ended.

Basics/Troubleshooting Oracle Bulk Loader

Hello All-

I am trying to set up and handle the initial configuration of PDI (Kettle) to almost exclusively load large data files (fixed width, CSVs, 0-20 GB file size) into our Oracle data warehouse. I am planning to set it up to run locally on a laptop with a scheduler, backing up the repository to a network drive. The machine itself runs Windows XP 32-bit (I know, I know: my employer).

I followed the setup instructions to get the correct JRE installed and created the environment variable. The application folder currently resides on my desktop. When I run spoon.bat everything appears to initialize properly (although the command console briefly flashes and I see DEBUG preceding what appears to be the path to the JRE, etc.). I have been able to set up a basic no-frills job that replicates the basic workflow (Initialize -> Get a file with SFTP -> Unzip/Move/Rename (to data archive on network drive) -> Oracle Bulk Loader (transformation step) -> End).

I have managed to get to the point where everything appears to be working except the Oracle Bulk Loader step. PDI connects over SFTP, moves the file to the data archive and handles the file-management steps, and the Oracle Bulk Loader step runs (and indicates success), but instead of the job ending it starts repeating itself for some reason. When I run the transformation step by itself, it just switches from idle to finished without loading anything. I have tested the connection to the dev environment, and the references to all the files appear correct (target schema, target table, sqlldr file and path, data file, log file, bad file, and a selection of fields to load to start with). When I test the Oracle connection via ODBC (not JDBC) it says it is connected.

Oracle Bulk Loader.jpg

I am a data analyst/consultant/scientist by training and not particularly familiar with the ETL/data-ops world, but I inherited a really convoluted mess of a monthly data-load process, hence trying to corral it with a properly managed ETL tool. If anyone could provide me with an Oracle Bulk Loader example or help me troubleshoot, it would be greatly appreciated!

Thanks,
John

Unable to create the database cache

Hi All,
I had the Community Edition working fine for a few months. All of a sudden it crashed, and now I am unable to restart it. I am using the spoon.sh script to start Spoon.
Here is the error message:

2014-07-07 11:19:33.422 java[5378:230b] [Java CocoaComponent compatibility mode]: Enabled
2014-07-07 11:19:33.453 java[5378:230b] [Java CocoaComponent compatibility mode]: Setting timeout for SWT to 0.100000
2014/07/07 11:19:38 - General - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Error starting Spoon shell
2014/07/07 11:19:38 - General - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : java.lang.RuntimeException: Unable to create the database cache:
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - Couldn't read the database cache
2014/07/07 11:19:38 - General - org.pentaho.di.core.exception.KettleFileException:
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - : Unable to read row metadata from input stream
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - Unable to locate value meta plugin of type (id) 131072
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - Unable to locate value meta plugin of type (id) 131072
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - org.pentaho.di.core.exception.KettleFileException:
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - : Unable to read row metadata from input stream
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - Unable to locate value meta plugin of type (id) 131072
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - Unable to locate value meta plugin of type (id) 131072
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General -
2014/07/07 11:19:38 - General - at org.pentaho.di.core.DBCache.getInstance(DBCache.java:247)
2014/07/07 11:19:38 - General - at org.pentaho.di.ui.spoon.Spoon.loadSettings(Spoon.java:7066)
2014/07/07 11:19:38 - General - at org.pentaho.di.ui.spoon.Spoon.init(Spoon.java:763)
2014/07/07 11:19:38 - General - at org.pentaho.di.ui.spoon.Spoon.createContents(Spoon.java:9098)
2014/07/07 11:19:38 - General - at org.eclipse.jface.window.Window.create(Window.java:426)
2014/07/07 11:19:38 - General - at org.eclipse.jface.window.Window.open(Window.java:785)
2014/07/07 11:19:38 - General - at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9129)
2014/07/07 11:19:38 - General - at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:638)
2014/07/07 11:19:38 - General - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2014/07/07 11:19:38 - General - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
2014/07/07 11:19:38 - General - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
2014/07/07 11:19:38 - General - at java.lang.reflect.Method.invoke(Method.java:597)
2014/07/07 11:19:38 - General - at org.pentaho.commons.launcher.Launcher.main(Launcher.java:151)
stopping

Any suggestions?
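One suggestion, offered as an assumption based on the "Couldn't read the database cache" line rather than a confirmed fix: the trace shows DBCache failing to deserialize its on-disk cache, which a crash can leave corrupted. Deleting the cache file is safe, since Spoon rebuilds it on the next start. The path below assumes the default Kettle home directory:

```shell
# Hypothetical fix: remove the corrupted database cache; Spoon recreates it.
rm -f ~/.kettle/db.cache*
```

If Spoon then starts cleanly, the only cost is that cached database metadata is re-fetched on demand.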

Thanks,
Karthick

Problems exporting to PDF

Hello, let me describe a problem I ran into. I run Pentaho Report Designer on Ubuntu 12.04. When I preview the report as PDF and save it in that format, it looks fine on my PC. On another machine running OS X 10.8.4 the file also displays without problems. But on machines running Windows XP and 7, some text-fields do not appear. When the file is opened with Adobe Reader, the following message comes up:
"There is an error on this page; Acrobat may not display it correctly. Contact the person who created the PDF document to resolve the problem."

I tried installing Sumatra PDF but had the same problem. I changed the encoding to iso-8859-1 under File->Configuration...output-pageable-pdf but got the same result. Any ideas?

Scheduled xaction generates one HTML file every time it runs

I'm using the Pentaho 5 CE BI server. I previously used version 4.8 and created some xactions that ran automatically once per day on a schedule. That worked fine in 4.8. Now I've migrated those xactions and scheduled them in BI server version 5. They work fine both ways (run manually and scheduled), but the problem is that every time an xaction runs on schedule, it creates an HTML file with the name of the xaction file.
For example, I have an xaction named LoadLilo, and every time it runs on schedule it creates a LoadLilo.html. Because of this, the folder where the LoadLilo xaction resides has accumulated many HTML files (LoadLilo.html, LoadLilo(2).html, LoadLilo(3).html, LoadLilo(4).html, LoadLilo(5).html), and they keep appearing as the scheduler runs the xaction.

Is there any way to avoid this? Maybe when creating the xaction in Pentaho Design Studio there is a way to disable saving the resulting HTML...

Any help?

PS: when I open any of these HTML files, I see the result page containing "Action Successfull".

Thanks, and sorry for my English.