Channel: Pentaho Community Forums

Copy Files

Hi,

I have configured Hadoop 2.6 on my CentOS server. I am facing an issue when connecting to HDFS using Pentaho.

I get the error below:

Code:

2015/01/06 17:11:10 - Hadoop Copy Files -2015/01/06 17:11:10 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://xxxxxx:9000/WAREHOUSE_DIR' : Could not resolve file "hdfs://xxxxxxx:9000/WAREHOUSE_DIR".
2015/01/06 17:11:10 - Hadoop Copy Files -
2015/01/06 17:11:10 - Hadoop Copy Files - ]
2015/01/06 17:11:10 - Hadoop Copy Files - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleFileException:
2015/01/06 17:11:10 - Hadoop Copy Files -
2015/01/06 17:11:10 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://xxxxxxx6:9000/WAREHOUSE_DIR' : Could not resolve file "hdfs://ddd:9000/WAREHOUSE_DIR".
2015/01/06 17:11:10 - Hadoop Copy Files -
2015/01/06 17:11:10 - Hadoop Copy Files -
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:154)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:102)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:363)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:315)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.job.Job.execute(Job.java:716)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.job.Job.execute(Job.java:859)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.job.Job.execute(Job.java:532)
2015/01/06 17:11:10 - Hadoop Copy Files -      at org.pentaho.di.job.Job.run(Job.java:424)

How can I resolve this VFS issue? From the command line I can copy files from local to HDFS, but through the tool it throws this error. With Hadoop 1.2 I can do this with PDI, but the same job with 2.6 doesn't work.
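
As a way to narrow down where the resolution fails, here is a minimal sketch that performs the same KettleVFS lookup shown in the stack trace, outside the job. It assumes the Big Data plugin and a Hadoop shim matching the cluster are on the classpath, that active.hadoop.configuration in the plugin's plugin.properties points at that shim, and that "namenode" is a placeholder host; note also that the shims bundled with PDI 5.2 may simply not support Hadoop 2.6, which is worth verifying first. The VFS package name may differ in other PDI versions.

Code:

import org.apache.commons.vfs.FileObject;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.vfs.KettleVFS;

public class HdfsVfsCheck {
    public static void main(String[] args) throws Exception {
        // Registers Kettle's VFS providers, including hdfs:// (this only
        // works if the Big Data plugin and a matching shim are loaded).
        KettleEnvironment.init();

        // "namenode" is a placeholder; use the same URL as the job entry.
        FileObject dir = KettleVFS.getFileObject("hdfs://namenode:9000/WAREHOUSE_DIR");
        System.out.println(dir.exists() ? "resolved" : "could not resolve");
    }
}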

Data type Date is not shown in the Excel output report

Dear all,

I am facing a problem with a date field in an Excel output report. When I use a date field in my report and generate Excel output, the date is shown in the generated report, but its data type appears as Custom instead of Date. Can anyone suggest how I can solve this problem?

Thanks in advance

Reading files under sub-directories in a gzip-compressed file!!

Hi,

Thanks in advance for your support.

I have CSV files under various sub-folders in a gzip-compressed file.

For instance:

All_Dumps.tar.gz

If I extract the above .gz file, the folder structure is:

All_Dumps
  /Dump01
    /File1.csv
  /Dump02
    /File2.csv
  /Dump03
    /File3.csv


When I read the All_Dumps.tar.gz file using the Text File Input step, I notice that the output is repeated, with more rows than the total count of lines in the .csv files.

For instance, if the total number of lines across File1.csv, File2.csv and File3.csv is 300, I get 500 output lines, and I can see that some are repeated.

I have also selected the "Include Sub Folders" option while setting the path of the input .gz file.


Please advise.
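
One thing that may explain the duplicates: a .tar.gz is a tar archive inside a gzip stream, so decompressing only the gzip layer leaves tar headers and block padding mixed into the text, which can surface as repeated or garbage rows. As a sanity check outside PDI, here is a minimal sketch using Apache Commons Compress (my choice of library, not something the step itself uses) that reads each CSV entry of the archive separately:

Code:

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;

public class TarGzCsvReader {
    public static void main(String[] args) throws IOException {
        try (TarArchiveInputStream tar = new TarArchiveInputStream(
                new GzipCompressorInputStream(new FileInputStream("All_Dumps.tar.gz")))) {
            TarArchiveEntry entry;
            while ((entry = tar.getNextTarEntry()) != null) {
                if (entry.isFile() && entry.getName().endsWith(".csv")) {
                    // Reads stop at the end of the current entry, so each
                    // CSV is seen exactly once, with no tar padding.
                    BufferedReader reader = new BufferedReader(new InputStreamReader(tar));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        System.out.println(entry.getName() + ": " + line);
                    }
                }
            }
        }
    }
}

If the line counts come out right here but not in the step, the duplication is likely in how the step handles the tar layer rather than in the files themselves.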

XML output not getting in CSV

Hi,

I have built a transformation from XML to CSV, but as part of normalization I am unable to fetch the values of different parent elements into multiple CSV files; I get a blank CSV file. Kindly help me with a solution. I have attached my .ktr and XML. Thanks in advance.

Intermittent error when running Kitchen in a batch file

Sometimes when I run a batch file that executes Kitchen, I receive the following error in my log file:
child index = 1, logging object : org.pentaho.di.core.logging.LoggingObject@b27b9b5 parent=b0e47214-3b60-4278-bd8d-b741fc1f38f9.
When I rerun the job without modification, it runs successfully.
My source table is in Oracle 11.2 and my target table is also in Oracle 11.2.
Has anyone else seen this problem? My transformation is very simple: a Table input, a Table output, and the output table is truncated. I can also say that it happens with both small and large amounts of data, from 10 rows to 1.1 million.
The unique indexes are the same on both the source and target tables.

How to aggregate multiple rows and copy them into columns (not rows)

Hi.

I would like to read an Excel file containing two columns (A and B) on each row:
"student name" and its "classroom".

Firstly, in my transformation I would like to aggregate the "classrooms" and put them into another Excel file in columns E, F, ...
My issue: I don't know how to "transpose" the classrooms so that I get "3A 4B 6D 5E" across columns, rather than "3A" on line 1, "4B" on line 2, etc.

Secondly, I don't know how to list under each classroom column (from E to H) the students who belong to that classroom.

Examples:

3A (in column E)
student1
student2

4B (in column F)
student3

I have also attached an Excel file which explains my aim visually.

Thanks for helping me

PDI 5.2.0.0
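
Setting PDI aside for a moment, the reshaping described above is a group-by (students grouped by classroom) followed by writing each group out as its own column; in Spoon this kind of pivot is usually approached with the Row denormaliser step. Purely to illustrate the grouping logic, here is a sketch in plain Java (not PDI API, with made-up sample data):

Code:

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GroupByClassroom {
    public static void main(String[] args) {
        // Sample rows as they would be read from columns A and B.
        String[][] rows = {
            {"student1", "3A"}, {"student2", "3A"}, {"student3", "4B"}
        };

        // Group student names by classroom, keeping first-seen order
        // so the classrooms line up with output columns E, F, ...
        Map<String, List<String>> byClassroom = new LinkedHashMap<>();
        for (String[] row : rows) {
            byClassroom.computeIfAbsent(row[1], k -> new ArrayList<>()).add(row[0]);
        }

        // Each classroom becomes a column: header first, then its students.
        for (Map.Entry<String, List<String>> e : byClassroom.entrySet()) {
            System.out.println(e.getKey() + " -> " + e.getValue());
        }
    }
}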

Detecting physically deleted records

Hi, I am working with Pentaho Kettle Data Integration because I have redesigned my database and need to migrate data to another database. My question is: how do I detect records physically deleted from database A, and then delete them from database B?

I'm migrating data little by little and in real time.

Sorry, I don't speak English very well.
Thanks.

doQuery: null

When an attempt is made to open a dashboard, the page shows the "Error processing component" message and the log has an "ERROR [pt.webdetails.cda.CdaContentGenerator] doQuery: null java.lang.NullPointerException" error message (full stack trace below).

Has anyone ever come across this error message? This is occurring at a client site where I don't have direct access to the environment, but in my attempts to "break" the same dashboard in my own environment I have yet to reproduce it.

This is a Windows platform with Pentaho 4.5, C-tools version 14.03.07, MonetDB Jan-2013 SP1 database.

Thanks,
Matt

ERROR [pt.webdetails.cda.CdaContentGenerator] doQuery: null
java.lang.NullPointerException
at org.pentaho.platform.repository.solution.dbbased.RepositoryFile.getSolutionPath(RepositoryFile.java:169)
at pt.webdetails.cpf.repository.pentaho.PentahoLegacySolutionAccess.hasAccess(PentahoLegacySolutionAccess.java:272)
at pt.webdetails.cda.settings.CdaFileResourceLoader.hasReadAccess(CdaFileResourceLoader.java:52)
at pt.webdetails.cda.settings.SettingsManager.getCdaSettings(SettingsManager.java:100)
at pt.webdetails.cda.settings.SettingsManager.parseSettingsFile(SettingsManager.java:167)
at pt.webdetails.cda.CdaCoreService.doQuery(CdaCoreService.java:81)
at pt.webdetails.cda.CdaContentGenerator.doQuery(CdaContentGenerator.java:71)
at sun.reflect.GeneratedMethodAccessor149.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at pt.webdetails.cpf.SimpleContentGenerator.invokeMethod(SimpleContentGenerator.java:329)
at pt.webdetails.cpf.SimpleContentGenerator.createContent(SimpleContentGenerator.java:160)
at sun.reflect.GeneratedMethodAccessor132.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.pentaho.platform.web.http.context.WebSpringPentahoObjectFactory$a.invoke(SourceFile:288)
at $Proxy30.createContent(Unknown Source)
at org.pentaho.platform.web.servlet.GenericServlet.doGet(GenericServlet.java:261)
at org.pentaho.platform.web.servlet.GenericServlet.doPost(GenericServlet.java:80)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:643)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:102)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.pentaho.ui.servlet.SystemStatusFilter.doFilter(SourceFile:72)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:879)
at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:617)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1760)
at java.lang.Thread.run(Unknown Source)

Hive Issue

Not directly a PDI issue, but indirectly since it seems PDI requires an older version of Apache Hadoop. I am running PDI 5.2 GA, Apache Hadoop 0.20.205.0, Apache Hive 0.13.1.

From my research, PDI supports up to Apache Hadoop 0.20.x, and the latest Apache Hive release that will work with 0.20.x is 0.13.1.

When I attempt to use the Hive shell, I get the error "Exception in thread "main" java.lang.NoSuchFieldError: ALLOW_UNQUOTED_CONTROL_CHARS..."

From what I've read, the workaround is to upgrade Apache Hadoop, but I don't think that is possible given PDI's lack of support for later versions.

So my question is, has anyone found a version of Apache Hive that will work with Apache Hadoop 0.20.x?

If not, what flavors/versions of Hadoop/Hive have been successfully used with PDI by others?

Value of Commit size in Table output

In my transformation I use Commit size = -1. What does this value (-1) mean?

Thanks, and sorry for my English!

How to specify a second line in a CDE chart?

Hi, I've got a query with many columns that is being used to drive a CDE dashboard. I'm using a couple of those columns to populate a line chart. In the chart pre-execution, I've got the following script; this pulls the category (the date) from the first column, and then the line values are pulled from the 44th column of data. This works fine.

function chart1() {
    this.chartDefinition.readers = [
        {names: 'category', indexes: 0},
        {names: 'value', indexes: 44}
    ];
}

My question is, how do I display a second line on this chart (from the 45th column of data)? I've tried the following, but it doesn't work.

function chart1() {
    this.chartDefinition.readers = [
        {names: 'category', indexes: 0},
        {names: ['value','value2'], indexes: [44,45]}
    ];
}

Suggestions are much appreciated; thanks!

Analyzer Date Format Incorrect Output Format

I'm trying to format the date output, and no matter what I do it doesn't output correctly. I'm using Pentaho Report Analyzer and I've tried using Annotations and the Analyzer Date Format, with no luck. Can anyone help me solve this problem?

My input:

DATE_REG
2014-01-17
2014-01-18

But my output is this:
DATE_REG
2014-01-17 00:00:00.0
2014-01-18 00:00:00.0
2014-01-19 00:00:00.0
2014-01-20 00:00:00.0

How to edit an existing Dashboard?

Hello,
I am running Pentaho EE 5.1.0 and the latest version of CDE available on Marketplace (14.12.10.1 STABLE). I created a dashboard and I cannot seem to find a way to open and edit it. I have tried all the available files, but when I select Open they render as content and do not open in the CDE editor.

Can someone give me some hints on what may be wrong? I appreciate all the help.



Thanks,

Redshift Query does not return data

Hello,
I am facing a problem with Pentaho 5.2 CE installed on Windows Server 2012 R2 Standard (64-bit) using a MySQL database repository.
I am connecting to an Amazon Redshift database via a Spoon transformation Table input step. The connection test is successful, but query preview, Get select statement and Explore DB all run on interminably; there is no error, but execution never completes. If the transformation is run with a simple select * SQL on a 26-row table, it stays in the Running state until the transformation is stopped, and even after that the Table input step remains in the Halting state.

JRE: I have tried JRE 7, both 32-bit and 64-bit, and also JRE 8, but the latter causes an error in other transformations that have Excel output.
PostgreSQL driver: postgresql-8.4-703.jdbc4.jar, postgresql-8.4-703.jdbc3.jar, postgresql-9.3-112-jdbc4.jar (although Amazon Redshift does not recommend the last one).

Please help.
Additional inputs:
SQL Workbench/J can execute the query on the server with the postgresql-8.4-703.jdbc4.jar driver.
The Pentaho Table output step is able to write to the database successfully.

Unable to call an xaction file in a CDE dashboard

Hi Group,

I am calling an xaction file in a CDE dashboard using the xaction component.

Version:
Pentaho: 5.1 CE
C-tools: 14.12.10.1 stable

The xaction file works fine on its own; when I call it from the dashboard I get this error:

ERROR [org.pentaho.cdf.xactions.ActionEngine] org.pentaho.platform.api.repository2.unified.UnifiedRepositoryException: exception while getting file with path "Charts/BubbleChart.xaction"

Reference number: 8e43c6e4-b008-4ea0-b386-f8a7b157bda9

My xaction file is located at demo/Charts/BubbleChart.xaction

I have also defined the proper path in the xaction component:
Solution: demo
Path: Charts
Action: BubbleChart.xaction

Can you please tell me what I am doing wrong here?

Regards
Sumit

Difference between Interactive reports, Analysis reports and Static Reports

Hi All,

What are the differences between Interactive reports, Analysis reports and Static reports?
Are Interactive reports suitable for large volumes of data, like millions of rows?
Can anybody list the features available in each type of report?


Regards,
Lourdhu.A

How to enable DB transactional mode through the API

Hi,

I have a UDJC step through which I automatically create and execute a transformation based on a certain set of parameters. Could someone tell me how to enable the transformation's transactional mode through the Java API (i.e. in the code of the UDJC step)? I'm talking about the "Make the transformation database transactional" checkbox in the "Miscellaneous" tab of the "Transformation properties" dialog.

I was looking into the Javadoc but couldn't find anything that would suggest such an option. I've checked the Trans and TransMeta classes, but I may have missed something.

Thanks.
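
Not an authoritative answer, but one place to look: in the .ktr XML that checkbox appears to be saved under the <unique_connections> tag, which corresponds to TransMeta's "unique connections" flag. A minimal sketch under that assumption (treat the flag-to-checkbox mapping as a guess to verify, not confirmed documentation):

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class TransactionalTransSketch {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // Build the TransMeta however the UDJC step already does it;
        // loading from a file here just keeps the sketch self-contained.
        TransMeta transMeta = new TransMeta("generated.ktr"); // hypothetical file name

        // Assumption: this flag backs the "Make the transformation
        // database transactional" checkbox on the Miscellaneous tab.
        transMeta.setUsingUniqueConnections(true);

        Trans trans = new Trans(transMeta);
        trans.execute(null);
        trans.waitUntilFinished();
    }
}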

Component listener always empty

Hi,

I am having trouble getting select components working for my data queries.

I have added parameters to my query, created a selector with the same parameter name, and also a table component that uses the parameter.

However, I cannot select a listener for the component, since I only get the 'Select All' option, and my selector is not working.

Can I get some advice on this, or a link to a tutorial that can explain it?

Regards, Laszlo

http://localhost:8099 does not work

Hi everyone,
I'm a student from Italy and I recently installed Pentaho CE. I searched the internet for how to start the BI server: the instructions were to open a terminal (I'm working on Mac OS X 10.10.1 with Java 1.8), run the command ./start-pentaho.sh in the biserver-ce directory, then open the administration console in a web browser (Firefox for me) by typing http://localhost:8099 and log in with the default credentials. But the connection fails and nothing appears.

Can someone please help me solve this? I have homework to hand in. Thanks.

Installing Mondrian in Tomcat to use as XMLA provider

Hey guys

So I need to use Mondrian as an XMLA provider, but I am having some trouble with the installation.

The documentation says to explode mondrian.war into the Tomcat folder, but I can't find that file in the new releases.

What should I do instead?