Channel: Pentaho Community Forums

How to pass selected components to another drop down (multi selection)

<script type="text/javascript">
$('#bt').click(function(){

$('#param2Select option:selected').appendTo('#dest');

});
</script>
<span class="param1Label"><b>Country</b></span>
<span id="param1Select"></span>
&nbsp;&nbsp;
<span class="param2Label"><b>State</b></span>
<span id="param2Select">
<!--<select id="sada" multiple=""></select> --></span>


<input type="button" id="bt" value='>'>


<select id="dest" multiple="">
</select>



I'm using two built-in select components to perform a cascading selection (the first is a simple select and the second a multi-select). Up to this point everything is fine, no issues.

What I'm trying to do is move the values selected in the second drop down into a custom (third) drop down. I wrote the code above for that, but I can't get it to work.

In the JSFiddle editor, the code below works without any issue for the same thing:



<select id='add' multiple>
<option value="volvo">Volvo</option>
<option value="saab">Saab</option>
<option value="opel">Opel</option>
<option value="audi">Audi</option>
</select>
<input type="button" id="bt" value='>'>
<select id="dest" multiple>
</select>

$('#bt').click(function(){

$('#add option:selected').appendTo('#dest');

});


1) I'm not able to get the id of the multi-select component (I used Firebug).
2) How can I use the ids of the built-in drop downs to map them to my custom drop down and move the data? (See the sketch below.)
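
For what it's worth, a minimal sketch of what question 2 might look like. It assumes the dashboard renders the actual <select> element inside the placeholder spans (#param1Select, #param2Select), which is how CDE select components typically behave, so the selector has to reach the nested element:

Code:

$('#bt').click(function(){
    // Assumption: the generated multi-select lives inside the #param2Select
    // span, so use a descendant selector instead of the span id itself.
    $('#param2Select select option:selected').appendTo('#dest');
});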

Thank you :-)

Sadakar

Carte Web Services in v3.2.4

I went through the documentation here: http://wiki.pentaho.com/display/EAI/Carte+Web+Services

I am stuck with using Carte/Kettle 3.2.4. This version is bundled with another product and must be maintained at the same level.

I am trying to use the Carte web services to add some automation to a data integration. Carte slave server is running on a windows host without any repository.

My plan is to allow a web page to execute a transformation via the Carte server.

http://localhost:8081/kettle/execute...s/updatedb.ktr

Unfortunately, when I use this URL, nothing happens: no jobs run, no logs are generated. I'm not sure whether these web service methods are only available in later versions of PDI. I can check the Carte server status fine via http://localhost:8081/kettle/status and run the rows generator sample job, but that is it.

Are these other WS features available in 3.2.4?

Cheers
Ryan

HTML Links in Schema Designer Needed

From what I can tell, the only place links can be coded into an Analyzer report is on the report itself, after you drag columns over. This makes little sense to me and seems like a half-hearted attempt to check a box on a feature list. If the column is removed and re-added, the links disappear. Plus, your typical user needs the links when they create their report. For example, to comply with HIPAA, it would be nice to display some information on a fully trackable separate report, but access that report from a column defined in the designer. Adding a link, on the medical record number column in the schema or some other column, to a report that would then record who accessed which patient's data on what date, etc., would be ideal.

If this exists in the latest version or some planned version, someone let me know.

Thanks,

Execute R script from a 'shell script' step

I am trying to execute an R script from a kettle job, and I don't really know whether it's a kettle or an R problem I am facing.

I use the Shell script step, where I call an R script that begins with #!/usr/bin/Rscript. It works fine; the only problem is that the argument passed to the R script is not transferred as intended. The argument is a directory path that I would like to set as the working directory for the R script.

Here is what the R script looks like:
Code:

#!/usr/bin/Rscript
args <- commandArgs(TRUE)
setwd(args)

When I run this from a shell, like
Code:

./some_script.R /some/path/
it works fine. But when I start it from the Shell script step, it fails with the message: cannot change working directory. I am sure the argument is OK; when I copy/paste the command from the log into a shell, it runs smoothly.
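
For reference, a slightly more defensive version of the script. It assumes the Shell script step appends a trailing newline or carriage return to the argument, which would make setwd() fail even though the path looks correct in the log:

Code:

#!/usr/bin/Rscript
# Take only the trailing arguments and use the first one explicitly.
args <- commandArgs(trailingOnly = TRUE)
# Strip trailing whitespace / carriage return that the calling step may
# append (assumption about the root cause).
dir <- gsub("[[:space:]]+$", "", args[1])
setwd(dir)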

Any help is appreciated, thanks.

put file with ftp

I have a fairly complicated job stream and I use the ${Internal.Transformation.Filename.Directory} variable to take care of different versions. I store some interim results in plain text files along with the jobs and transformations. A typical reference to such a file in my solution looks like ${Internal.Transformation.Filename.Directory}/../files/filename.csv.

I found that most of the steps can handle this kind of file reference without a problem, but the 'Put a file with FTP' step fails to parse it properly. Has anyone else come across this problem? Thanks in advance.

Model perspective

Hi,
I want to use the Model perspective in Kettle. Where can I get the Model perspective plugin?

Pentaho Plugin custom content generation

Hello Community,

I am currently developing a custom BI Server plugin. In it I want to add a custom content type called ".pin". For that I have configured plugin.xml and plugin.spring.xml (see the attachment).

But I get the following error when that file is called:

13:23:05,175 ERROR [GeneratorStreamingOutput] Error generating content from content generator with id [run]
java.lang.NullPointerException
at org.pentaho.platform.plugin.services.webservices.content.PluginFileContentGenerator.createContent(PluginFileContentGenerator.java:33)
at org.pentaho.platform.engine.services.solution.SimpleContentGenerator.createContent(SimpleContentGenerator.java:57)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.generateContent(GeneratorStreamingOutput.java:229)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.write(GeneratorStreamingOutput.java:156)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:58)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:37)
at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:306)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1479)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:111)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:116)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:161)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:83)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:88)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:59)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:112)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:66)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:208)
at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:271)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Unknown Source)



2) I am also not able to see my content type's icon, which I set in the plugin.xml file.

Community please help me to solve this issue.

Script for backup of Pentaho BI (such as JCR, dashboards, mondrian schemas)

Hi!


I'm looking at setting up a daily backup of my Pentaho BI 5.x installations. I would appreciate it if anyone could suggest a ready-made backup script that backs up the important stuff. Off the top of my head I can think of the actual JCR files in /usr/local/biserver-ce/pentaho-solutions/system/jackrabbit, but a zipped backup of the whole solution repository, such as what you can get by using the import-export.sh script, would be nice as well, especially for restoring single files.

I found this one when I searched the net: https://github.com/harisekhon/toolbo...taho-backup.pl. Does anyone know whether it is any good?
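
In case it helps, a bare-bones nightly sketch along these lines. The Jackrabbit path comes from the post; everything else (backup location, the commented-out export call) is an assumption and needs adapting to the installation:

Code:

#!/bin/sh
# Minimal nightly backup sketch for a Pentaho BI CE 5.x install.
STAMP=$(date +%Y%m%d)
BACKUP_DIR=/backup/pentaho/$STAMP     # assumed target location
mkdir -p "$BACKUP_DIR"

# 1) Copy the JCR repository files. Stop or quiesce the BA server first,
#    otherwise the Jackrabbit files may be copied in an inconsistent state.
tar czf "$BACKUP_DIR/jackrabbit.tar.gz" \
    /usr/local/biserver-ce/pentaho-solutions/system/jackrabbit

# 2) Optionally also export the whole solution repository as a zip with
#    import-export.sh so single files can be restored later; check that
#    script's own help output for the exact switches of your 5.x version.
# /usr/local/biserver-ce/import-export.sh ... > "$BACKUP_DIR/export.log"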

How to format a value inside a formula?

I have a simple formula to show either one field or another in my report. One of them is a String and the other numeric. I would like to format the numeric one.

This is what I have now on a label-type element:

=if([param];[field1Str];[field2Num])

and I would like simple formatting, something like

=if([param];[field1Str];format([field2Num];"0.000"))

but cannot find any way to do it.

PS. I guess I can print both fields at the same location and set visibility for only one of them as a workaround :), but formatting a number in a formula should be easy!

Thks

Mondrian Schema 3.X Junk Dimension and aggregate tables

Hello, I'm having problems figuring out how to use four dimensions ('Junk_Dim1', 'Junk_Dim2', 'Junk_Dim3', 'Junk_Dim4') that share the same table ('junk_dimensions') when using aggregate tables.

What I've done in the .xml is something like this:

Code:

<AggName name="junk">               
    <AggFactCount column="fact_count"/>
    <AggMeasure column="price" name="[Measures].[Price]"/>
    <AggLevel column="junk_id" name="[Junk_Dim1].[Junk_Dim1 ID]" collapsed="false"/>
    <AggLevel column="junk_id" name="[Junk_Dim2].[Junk_Dim2 ID]" collapsed="false"/>
    <AggLevel column="junk_id" name="[Junk_Dim3].[Junk_Dim3 ID]" collapsed="false"/>
    <AggLevel column="junk_id" name="[Junk_Dim4].[Junk_Dim4 ID]" collapsed="false"/>   
</AggName>


 <Dimension foreignKey="junk_id" name="Junk_Dim1">
    <Hierarchy hasAll="true" primaryKey="junk_id">
        <Table name="junk_dimensions"/>
        <Level name="Whatever1" column="whatever1" type="String" uniqueMembers="true" approxRowCount="6"/>
        <Level name="Junk_Dim1 ID" column="junk_id" type="Numeric" uniqueMembers="true" approxRowCount="182" visible="false"/>
    </Hierarchy>
</Dimension>


 <Dimension foreignKey="junk_id" name="Junk_Dim2">
    <Hierarchy hasAll="true" primaryKey="junk_id">
        <Table name="junk_dimensions"/>
        <Level name="Whatever2" column="whatever2" type="String" uniqueMembers="true" approxRowCount="6"/>
        <Level name="Junk_Dim2 ID" column="junk_id" type="Numeric" uniqueMembers="true" approxRowCount="182" visible="false"/>
    </Hierarchy>
</Dimension>

 <Dimension foreignKey="junk_id" name="Junk_Dim3">
    <Hierarchy hasAll="true" primaryKey="junk_id">
        <Table name="junk_dimensions"/>
        <Level name="Whatever3" column="whatever3" type="String" uniqueMembers="true" approxRowCount="6"/>
        <Level name="Junk_Dim3 ID" column="junk_id" type="Numeric" uniqueMembers="true" approxRowCount="182" visible="false"/>
    </Hierarchy>
</Dimension>

 <Dimension foreignKey="junk_id" name="Junk_Dim4">
    <Hierarchy hasAll="true" primaryKey="junk_id">
        <Table name="junk_dimensions"/>
        <Level name="Whatever4" column="whatever4" type="String" uniqueMembers="true" approxRowCount="6"/>
        <Level name="Junk_Dim4 ID" column="junk_id" type="Numeric" uniqueMembers="true" approxRowCount="182" visible="false"/>
    </Hierarchy>
</Dimension>

But I get the error: "Four levels, '[Junk_Dim1].[Junk_Dim1 ID]' and '[Junk_Dim2].[Junk_Dim2 ID]' and '[Junk_Dim3].[Junk_Dim3 ID]' and '[Junk_Dim4].[Junk_Dim4 ID]', share the same foreign column name 'junk_id'."

In the fact table I just have the 'junk_id' column and it works fine, which is why I think it should also work for the aggregates.

I also tried using aliases, but with no success.

JTDS error on windows 2012 r2

Hi, I'm moving a PDI application from Windows 2008 R2 to 2012 R2, but some transformations that insert/update on MS SQL stop with errors.
The PDI version is 5.2.0; the database is MS SQL Server 2008 R2.
On Windows 2008 R2 (and also on Windows 2003) it works perfectly.
The error log is:

Quote:

ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Because of an error, this step can't continue:
ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleException:
Error batch inserting rows into table [Costi_Prodotto].
Errors encountered (first 10):




Error updating batch
The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 23 (""): Data type 0xE7 has an invalid data length or metadata length.




at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:342)
at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:118)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
at java.lang.Thread.run(Unknown Source)
Caused by: org.pentaho.di.core.exception.KettleDatabaseBatchException:
Error updating batch
The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 23 (""): Data type 0xE7 has an invalid data length or metadata length.


at org.pentaho.di.core.database.Database.createKettleDatabaseBatchException(Database.java:1377)
at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:289)
... 3 more
Caused by: java.sql.BatchUpdateException: The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 23 (""): Data type 0xE7 has an invalid data length or metadata length.
at net.sourceforge.jtds.jdbc.JtdsStatement.executeBatch(JtdsStatement.java:1069)
at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:285)
... 3 more


I tried updating the jTDS jar, but nothing changed.

Any ideas?

mongodb.MongoException$CursorNotFound: cursor 447066068147 not found on server

Hi,

I am using PDI 5.0 Community Edition with a MongoDB database. When I fetch data from MongoDB using the MongoDB Input and JSON Input steps, the following error occurs:

->mongodb.MongoException$CursorNotFound: cursor 447066068147 not found on server

This error occurs after about 5000 rows have been fetched.

I cannot figure out what the exact problem is.

If anyone knows, kindly provide a solution.

Regards,
Rushikesh

mongodb.MongoException$CursorNotFound: cursor 447066068147 not found on server

Hi,

I am using PDI 5.0 with MongoDB. When I access data from MongoDB using the MongoDB Input and JSON Input steps, the error below occurs:

com.mongodb.MongoException$CursorNotFound: cursor 447066068147 not found on server

This error occurs after 5000 rows have been fetched.

Please provide a solution.

MongoDB Input Error

Hi, I am using the MongoDB Input step; when we fetch data from it, the same error occurs. Does anyone know why?

Denormalizing rows to columns !!

Hi,

Thanks in advance for your support !!

I have the following input

SID Value
BA1 100
BA2 200
BA3 300

and this keeps growing,

What I was expecting in the output is

BA1 BA2 BA3
100 200 300

I have tried using the Row Denormaliser and Row Flattener steps, but without success.

Please advise.
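
For reference, a sketch of how the Row Denormaliser step is usually configured for data shaped like this. The field names come from the sample above; the constant grouping field is an assumption, added because the step needs something to group on and the sample has only a single group:

Code:

Add constants:      grp = 1                 (same dummy value for every row)
Sort rows:          on grp, SID
Row Denormaliser:
  Key field:        SID
  Group field:      grp
  Target fields:
    Target fieldname    Value fieldname    Key value
    BA1                 Value              BA1
    BA2                 Value              BA2
    BA3                 Value              BA3

Note that the target fields must be listed explicitly, so if the set of SID values keeps growing, something like the ETL Metadata Injection step (or generating the transformation) is usually needed instead of a fixed mapping.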

How to convert a number padded with blanks?

I am new to Kettle - so bear with me!

I am trying to read from a file with fixed-length records/lines. My problem is numeric fields that are padded with blanks, i.e. " 123".
My fields are 4 characters long, but no matter what pattern I specify (e.g. "0" or "0000" or "###0" or " 0"), I invariably get conversion errors like:

Code:

2014/12/11 13:32:09 - Modified Java Script Value.0 - id String : couldn't convert String to number : non-numeric character found at position 1 for value [  2]
What is a legal and working pattern for numbers padded with leading blanks?
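
Not a direct answer to the pattern question, but two things that usually work here (both are assumptions about the setup): set the field's "Trim type" to "both" in the input step's Fields tab and use a plain "#" or "0" format, or strip the padding in the Modified Java Script Value step before converting, for example:

Code:

// Sketch for a Modified Java Script Value step; "id" is the incoming
// String field from the error above, "id_num" is a new output field.
// Strip the padding blanks first, then convert.
var id_num = parseFloat(("" + id).replace(/^\s+|\s+$/g, ""));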

ERROR: Unable to open transformation: null

No matter what I try, I can't figure out how to start a transformation inside a job when using a repository and executing the job remotely.

The setup:
The Job: job.png
The Transformation: trans.png

- I'm using Spoon to create the job and the transformation
- Both the job and the transformation are in the same repository
- I am, in Spoon, connected to the repository
- I upload the jobs for remote execution with Spoon to a remote host (same as repository)

The result:
2014/12/11 17:14:29 - test_job - Start of job execution
2014/12/11 17:14:29 - test_job - Handling extension point for plugin with id 'JobTransactionExtensionPlugin' and extension point id 'JobStart'
2014/12/11 17:14:29 - test_job - Handling extension point for plugin with id 'JobBeginProcessingExtensionPointPlugin' and extension point id 'JobBeginProcessing'
2014/12/11 17:14:29 - test_job - Handling extension point for plugin with id 'JobAfterJobEntryExectionExtensionPointPlugin' and extension point id 'JobBeforeJobEntryExecution'
2014/12/11 17:14:29 - test_job - Handling extension point for plugin with id 'JobBeforeJobEntryExectionExtensionPointPlugin' and extension point id 'JobBeforeJobEntryExecution'
2014/12/11 17:14:29 - test_job - exec(0, 0, START.0)
2014/12/11 17:14:29 - START - Starting job entry
2014/12/11 17:14:29 - test_job - Starting entry [test_transformation]
2014/12/11 17:14:29 - test_job - Handling extension point for plugin with id 'JobAfterJobEntryExectionExtensionPointPlugin' and extension point id 'JobBeforeJobEntryExecution'
2014/12/11 17:14:29 - test_job - Handling extension point for plugin with id 'JobBeforeJobEntryExectionExtensionPointPlugin' and extension point id 'JobBeforeJobEntryExecution'
2014/12/11 17:14:29 - test_job - exec(1, 0, test_transformation.0)
2014/12/11 17:14:29 - test_transformation - Starting job entry
2014/12/11 17:14:29 - test_transformation - Opening a transformation by reference with id=c9f8a0a5-ab16-41b8-a153-4d723b400696
2014/12/11 17:14:29 - test_transformation - Starting transformation...(file=null, name=test_transformation, repinfo=null)
2014/12/11 17:14:29 - test_transformation - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unable to open transformation: null

2014/12/11 17:14:29 - test_transformation - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : java.lang.NullPointerException
2014/12/11 17:14:29 - test_transformation - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:823)
2014/12/11 17:14:29 - test_transformation - at org.pentaho.di.job.Job.execute(Job.java:716)
2014/12/11 17:14:29 - test_transformation - at org.pentaho.di.job.Job.execute(Job.java:859)
2014/12/11 17:14:29 - test_transformation - at org.pentaho.di.job.Job.execute(Job.java:532)
2014/12/11 17:14:29 - test_transformation - at org.pentaho.di.job.Job.run(Job.java:424)
2014/12/11 17:14:29 - test_job - Finished job entry [test_transformation] (result=[false])
2014/12/11 17:14:29 - test_job - Job execution finished

Any ideas?
Do I need to configure the repository in the remote execution host?
Do I need to configure the job to look in a specific repository?

Why is this forum moderated?

Every time I post a new thread or reply to an existing thread, I receive the following message:

"Thank you for posting! Your post will not be visible until a moderator has approved it for posting."

From there it can take 24-48 hours for the post to appear on the forum, and sometimes it never does.

As a student trying to learn Kettle, I find this very unhelpful and very frustrating.

Am I being moderated because I'm fairly new here, or are all posts handled this way?

I've never encountered a forum so difficult to use. Can one of the moderators help me out here? If you, as the moderator, choose not to post this, can you at least send me a private message to explain things? Thx.

materialized view in postgresql 9.3

I may be asking this question in the wrong place, but in any case the error message comes from Mondrian. I recently downloaded the Saiku 3 community version; my schema file references a materialized view in PostgreSQL, but I get the following error message:


mondrian.rolap.RolapSchema$MondrianSchemaException: Table 'kuailian_bi' does not exist in database. (in Table)

In my schema:

<?xml version='1.0'?>
<Schema name="kuailian" metamodelVersion='4.0'>
<PhysicalSchema>
<Table name="kuailian_bi" schema="radius" />
</PhysicalSchema>

kuailian_bi is the materialized view I created in PostgreSQL.
I don't get this problem in Saiku 2.6. Can somebody kindly point out a way forward?

Text Editor Component in Pentaho Dashboards

Hi,

I am trying to use the Text Editor component in dashboards, but when I click the save button in the component, the notification panel says "we are sorry! we really tried..".
The path to the file is of the format: http://localhost:8081/pentaho/notes/1.txt

Requirement:
Save the entered data to a text file on the server. The file may not exist initially; on clicking save, the component has to create the file and save the contents. The component should also be able to load already existing contents from the file.
Can anyone let me know whether this is achievable using the Text Editor component and, if so, how it can be achieved?

Thanks,
Sunny