Channel: Pentaho Community Forums

Online Tableau Training with real time projects by Professional IT experts

Tableau Online Training by SUN IT LABS is a great opportunity for students: it is easy to learn and forms a good entry module for beginners. Tableau is one of the most in-demand data analysis skills for building a career, and our students have been able to find jobs quickly in the USA and India. SUN IT LABS is your one-stop solution for learning Tableau online from the comfort of your home, with flexible timings. We provide interactive and effective classes for every student, so you can interact with the faculty and clarify your doubts.

Some of the Tableau online training course topics covered by our professionals:

  1. Tableau Fundamentals
  2. Tableau Advanced
  3. Tableau Server Comprehensive
  4. Administration
  5. Connecting to data
  6. Editing data connection
  7. Creating custom calculations
  8. Aggregation & Disaggregation
  9. Backgrounder
  10. Server security


Many more subtopics are covered; for details, please go through the website.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact Details: USA+1 512 234 3553

Email: Contact@Sunitlabs.com

Web: http://sunitlabs.com/tableau-online-training/

How to execute the same transformation more than once (like a for-loop task in SSIS)

Hi all,
I am a bit confused; can you share your experience, please?

My requirement is to execute the same transformation 3 times and write the output into a text file with a row-number column.
For example:
Source (table):
id name
1 aaa
2 bbb
3 ccc

Target (file):
id name row_num
1 aaa 1
2 bbb 2
3 ccc 3
1 aaa 4
2 bbb 5
3 ccc 6
1 aaa 7
2 bbb 8
3 ccc 9


Help will be appreciated.
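The target layout above amounts to replaying the source rows three times while a single counter keeps running across all passes. A hypothetical Python sketch of that logic (an illustration, not a Kettle recipe):

```python
# Source rows, as in the post's example table.
source = [(1, "aaa"), (2, "bbb"), (3, "ccc")]

rows = []
row_num = 0
for _ in range(3):              # execute the same "transformation" 3 times
    for rec_id, name in source:
        row_num += 1            # counter keeps running across all passes
        rows.append((rec_id, name, row_num))

for rec in rows:
    print(*rec)                 # 1 aaa 1 ... 3 ccc 9
```

In PDI itself, one common shape (an assumption, not the only way) is a job loop that calls the transformation with the Text file output in append mode, with the running number held in a variable or a database sequence so it survives between runs.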

Passing a filename from a job to a transformation

Good afternoon,

In the attached job, Kettle opens a connection with an FTP site (the credentials have been removed before I attached it), then downloads and unzips any files it finds there.

I then want it to call a transformation for each file, which will load the rowset into a database. (The attached transformation is not the one I intend to use but a simple replacement which will create a dummy file when given an input filename. Its purpose is to show me whether a filename is being passed into the transformation or not).

Unfortunately, absolutely nothing I try results in a filename being passed to the transformation, although Kettle claims that the transformation step in the job is working properly. All other job steps seem to work fine.

Can anyone tell me how this is supposed to be done?

With kind regards,

Will

job.kjb, transformation.ktr
Attached Files

Importing a csv file with Arabic data

Hi,
I made a transformation to import a csv file into MySQL (viewed through phpMyAdmin). My csv file contains Arabic words and is encoded in UTF-8, but in phpMyAdmin I see ???????????.
When I import the csv file directly through phpMyAdmin, everything is OK.
Help please.
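The ??????????? characters usually mean the text was transcoded somewhere along the way (input step, driver, connection, or table charset) through an encoding that cannot represent Arabic. A small Python sketch of the mechanism, with latin-1 standing in for the lossy charset (an assumption for illustration):

```python
text = "مرحبا"  # "hello" in Arabic, 5 characters

# UTF-8 round-trips the Arabic text without loss.
assert text.encode("utf-8").decode("utf-8") == text

# A Latin-1 target cannot represent Arabic: with lossy replacement
# every character degrades to "?", which is what shows up in the table.
mangled = text.encode("latin-1", errors="replace").decode("latin-1")
print(mangled)  # -> ?????
```

The usual fix is to make every hop explicitly UTF-8: the CSV input step's encoding, the database connection (for MySQL, an option such as characterEncoding=UTF-8 is commonly suggested), and the table's character set.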

Error with decimal Number on a level property

Hi,

I am trying to build a map chart for the university. I have created a schema with Schema Workbench; it has a dimension called Hotel with two properties, Latitude and Longitude, whose values are of type double in my database.

When I run an MDX statement in Schema Workbench, it returns an integer number.
I tried passing format_string = '####.0000###', and I also tried with a comma.
I tried a property formatter with a script in JavaScript.

I have no idea why I can't read a decimal number and it returns an integer instead.

My cube looks something like this:

HTML Code:


<Schema name="Analisis Reservas" measuresCaption="Medidas">
    <Cube name="Analysis Detail Booking" caption="Analisis Detalle Reservas" visible="true" cache="true" enabled="true">
    ...

        <Dimension type="StandardDimension" visible="true" foreignKey="HOTEL" highCardinality="false" name="Hotel" caption="Hotel">
          <Hierarchy name="HotelZone" visible="true" hasAll="true" primaryKey="HOTEL_CODE" caption="Hotel por zona">
            <Table name="DIM_HOTEL" schema="DCELLSTATISTIC_PRE">
            </Table>
            <Level name="Zona" visible="true" column="ZONE_CODE" nameColumn="HOTEL_ZONE" type="String" uniqueMembers="true" levelType="Regular" hideMemberIf="IfBlankName" caption="Zona">
            </Level>
            <Level name="Hotel" visible="true" column="HOTEL_CODE" nameColumn="HOTEL_NAME" type="String" uniqueMembers="true" levelType="Regular" hideMemberIf="IfBlankName" caption="Hotel">
              <Annotations>
                <Annotation name="Data.Role">
                  <![CDATA[Geography]]>
                </Annotation>
                <Annotation name="Geo.Role">
                  <![CDATA[location]]>
                </Annotation>
              </Annotations>
              <Property name="Lat" column="HOTEL_LATITUD" type="Numeric" caption="Latitud">
                <PropertyFormatter>
                  <Script language="JavaScript">
                    <![CDATA[return propertyValue.toFixed(4);]]>
                  </Script>
                </PropertyFormatter>
              </Property>
              <Property name="Long" column="HOTEL_LONGITUD" type="Numeric" caption="Longitud">
                <PropertyFormatter>
                  <Script language="JavaScript">
                    <![CDATA[propertyValue.toFixed(4);]]>
                  </Script>
                </PropertyFormatter>
              </Property>
            </Level>
          </Hierarchy>
        </Dimension>
        ...
        <Measure name="Total Amount" column="TOTAL_AMOUNT" aggregator="sum" caption="Importe" visible="true">
        </Measure>
        <Measure name="Total Pax" column="TOTAL_PAX" aggregator="sum" caption="Num personas" visible="true">
        </Measure>
        <Measure name="Total Nights" column="TOTAL_NIGHTS" aggregator="sum" caption="Num Noches" visible="true">
        </Measure>
        <Measure name="Total Rooms" column="TOTAL_ROOMS" aggregator="sum" caption="Num habitaciones" visible="true">
        </Measure>
    </Cube>

</Schema>

and the MDX statement I run is:

Code:


WITH
MEMBER [Measures].[Lat] AS ([Hotel].[Hotel].CurrentMember.Properties("Lat")),
FORMAT_STRING = "####.0000###"
MEMBER [Measures].[Long] AS ([Hotel].[Hotel].CurrentMember.Properties("Long")),
FORMAT_STRING = "####.0000###"
SELECT
    {[Measures].[Lat], [Measures].[Long]}
ON COLUMNS,
    {[Hotel].[Hotel].MEMBERS}
ON ROWS
FROM [Analysis Detail Booking]

Does anyone know how to solve this problem, and why it is happening to me? :confused:
The same thing happens when I run the MDX statement in the Saiku plugin.

Thanks a lot,
Constanza
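One detail worth noting in the schema above: the Lat formatter script is `return propertyValue.toFixed(4);`, while the Long script evaluates `propertyValue.toFixed(4);` without `return`, so that formatter produces nothing. The difference, mirrored in a small Python sketch (Python used only for illustration):

```python
def format_lat(value):
    # Mirrors: return propertyValue.toFixed(4);
    return f"{value:.4f}"

def format_long(value):
    # Mirrors the script with no `return`: the expression is
    # evaluated, then discarded, so the function yields None.
    f"{value:.4f}"

print(format_lat(1.5))    # -> 1.5000
print(format_long(1.5))   # -> None
```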

Kettle throwing error while using S3

Hi All,

In a Kettle package, I pick up data from a csv file stored on S3 and load it into a target table. When I run the Kettle job (which calls that transformation) from Spoon, it works fine. But when I try to execute the same job (.kjb) file with Kitchen.bat from the command prompt, it throws the following error:

2014/07/21 11:59:29 - RumbaProductDetails - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleException:
2014/07/21 11:59:29 - RumbaProductDetails - Unexpected error during transformation metadata load
2014/07/21 11:59:29 - RumbaProductDetails -
2014/07/21 11:59:29 - RumbaProductDetails - Missing plugins found while loading a transformation
2014/07/21 11:59:29 - RumbaProductDetails -
2014/07/21 11:59:29 - RumbaProductDetails - Step : S3CSVINPUT
2014/07/21 11:59:29 - RumbaProductDetails -
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1212)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:634)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.Job.execute(Job.java:678)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.Job.execute(Job.java:815)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.Job.execute(Job.java:500)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.Job.run(Job.java:407)
2014/07/21 11:59:29 - RumbaProductDetails - Caused by: org.pentaho.di.core.exception.KettleMissingPluginsException:
2014/07/21 11:59:29 - RumbaProductDetails - Missing plugins found while loading a transformation
2014/07/21 11:59:29 - RumbaProductDetails -
2014/07/21 11:59:29 - RumbaProductDetails - Step : S3CSVINPUT
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2931)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2813)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2774)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2759)
2014/07/21 11:59:29 - RumbaProductDetails - at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta

Please let me know what needs to be done to resolve this.

How on earth can I refresh data? (clear cache?)

I have Pentaho 5.1 on Windows. The database backend for Mondrian is MS SQL Server.
The only way I have found to refresh the data in queries is to restart Tomcat.
The Mondrian clear-cache command doesn't work. I have the problem in both the Saiku plugin and the Pivot4J plugin, which is why I figure the problem is with the BI Server.

Steps source code

Hi, where can I get the source code of the various steps? I need to analyze some of them and modify them. Is it possible to get it?

SCD implementation in HIVE

Hi friends,

I am trying to maintain dimensions in a Hadoop cluster. I am using Hive and writing the code by hand, but I have also tried Pentaho DI without success. If anyone has an idea of how I can achieve this with the DI tool, I would really appreciate it if you could share it. Thank you in advance.

Multiple query results instead of one?

Hello.

We are working on a project at the moment that has to display multiple query results on the dashboard. So far our only solution has been to create one query component for each query, which does not seem like the best solution to us, since each query component only needs to display a single number.

So this configuration at the moment is something like this:

SQL-database server -> Pentaho -> sql over sqlJndi -> Query & Parameters -> Query Component -> Shows it on dashboard at HTML-object

The queries look something like this in sql over sqlJndi:
Code:

select    sec_to_time(max(duration))
from
    events
where
    id = 5
and
    duration <= 3600

So only one result comes out... As you can see, we could attach many other components to the same query. Let's say we also want the min and avg of the same duration: we would have to create 3 different sql over sqlJndi data sources for this purpose. So, if you can see my worry, are there any solutions? :)

edit:

so this sends only 1 number to the dashboard as a result:
SQL-database server -> Pentaho -> sql over sqlJndi -> Query (1 output result) & Parameters -> 1 Query Component -> Shows it on dashboard at HTML-object

and what we want is 3 numbers sent to the dashboard from the same query:
SQL-database server -> Pentaho -> sql over sqlJndi -> Query (3 output result) & Parameters -> 3 diff Query Component -> Shows it on dashboard at 3 HTML-objects
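A single statement can return all three aggregates as three columns of one row, so one data source can feed all three HTML targets. A sketch using Python's sqlite3 in place of the real SQL server (table and filter taken from the post; sec_to_time is MySQL-specific and omitted here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, duration INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(5, 120), (5, 3600), (5, 90), (6, 500)])

# One round trip returns max, min and avg as three columns of one row.
row = conn.execute("""
    SELECT MAX(duration), MIN(duration), AVG(duration)
    FROM events
    WHERE id = 5 AND duration <= 3600
""").fetchone()

print(row)  # -> (3600, 90, 1270.0)
```

On the CDE side the three cells would then be split out of the single result row, e.g. in the component's post-execution hook; that wiring is assumed, not shown.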

Get data into string

Greetings,
I have a dashboard with static/dynamic text. Dollar amounts coming from SQL statements need to go into that text. Using CDE, how do I get a SQL data source to feed data into a text line on the dashboard?
Thanks.
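A common pattern (an assumption about the intended setup, not the only way) is to give the dashboard text a placeholder element and have a query component write the formatted amount into it from its post-execution hook. The formatting half of that, sketched in Python for illustration:

```python
def format_dollar(amount):
    """Format a number as a US dollar string, e.g. 1234.5 -> $1,234.50."""
    return f"${amount:,.2f}"

# The dashboard text with the query result dropped in.
sentence = f"Total spend this quarter was {format_dollar(1234.5)}."
print(sentence)
```

In CDE the equivalent step is typically a small JavaScript snippet in the query component's postExecution/postFetch that writes the first cell of the result set into the placeholder element.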

Running Pentaho BI 5.1.0-752 on a Debian-based server with Tomcat 7

Hi,

I have previous experience running Pentaho BI 5.0.1 with MySQL using http://anonymousbi.wordpress.com/201...llation-guide/ and then moving it from the embedded Tomcat to the operating-system one.

I performed the same procedure with Pentaho BI 5.1.0-752:
  1. Change the configuration to use MySQL instead of the embedded database
  2. Migrate the databases from Hypersonic to MySQL
  3. Copy the needed directories (pentaho, pentaho-solutions, pentaho-style) to /var/lib/tomcat7/webapps
  4. Restart Tomcat 7

Everything starts OK, BUT when I try to add a new MySQL datasource, the frontend returns an Internal Server Error and the following message appears in the logs:

Jul 28, 2014 6:59:52 PM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
java.lang.RuntimeException: Database type not found!
at org.pentaho.di.core.database.DatabaseMeta.setValues(DatabaseMeta.java:628)
at org.pentaho.di.core.database.DatabaseMeta.setDefault(DatabaseMeta.java:497)
at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:488)
at org.pentaho.database.util.DatabaseUtil.convertToDatabaseMeta(DatabaseUtil.java:30)
at org.pentaho.platform.dataaccess.datasource.wizard.service.impl.ConnectionService.checkParameters(ConnectionService.java:273)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:97)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:102)
at org.pentaho.platform.web.servlet.PluginDispatchServlet.service(PluginDispatchServlet.java:89)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:33)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:225)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1002)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
database type with plugin id [Oracle] couldn't be found!

The MySQL driver is installed in Tomcat.

Any idea what I am doing wrong?

Thanks for your time.

Best regards

Create a dashboard with Pentaho Community

Hi guys,
I am new to this forum and to Pentaho. I would like to create graphics (dashboards) with Pentaho Community, but I don't know if that is possible. I know it is possible with the Enterprise Edition; is it also possible with the Community Edition, and how can I create the graphics?

Sorry for my bad English.

Question about processing error records

I am trying to create a reusable transformation which will load error records into an error table.

The fields in my error table are: hostname, parent_job_batch_id, transformation_name, transformation_batch_id, error_code, error_description, error_datetime and error_data. The error_data field will store the entire erroneous record.

The issue I am having is that the format of the error record will be different depending on the output table generating the error(s).

I've attached a ktr to try to illustrate this. Please think of each thread as a separate transformation: one which loads a person table, and one which loads an address table.

The error fields output from my "load_person" transformation will be: first_name, last_name, dob, error_desc, error_code, transformation_name, transformation_batch_id, partent_job_id, hostname and error_datetime.

The error fields output from my "load_address" transformation will be: address1, address2, city, state, zip, error_desc, error_code, transformation_name, transformation_batch_id, partent_job_id, hostname and error_datetime.

I would like to have a transformation which contains the steps from "clean_error_desc" through to "load_error_table" (and of course some input step), and have any of my transformations pass error records to it.

Is it possible to have my error-processing transformation accept a dynamic record? I've looked at the Metadata Injection step, but I'm not sure it will fix my issue.

Any thoughts/suggestions would be greatly appreciated.

Thank you
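One way to make the error stream uniform before the shared "clean_error_desc" to "load_error_table" steps is to fold each caller's data columns into the single error_data field, so every transformation emits the same row layout. A hypothetical sketch (field names taken from the post; the folding rule itself is an assumption):

```python
# Audit fields shared by every caller; everything else is payload.
SHARED = ["error_desc", "error_code", "transformation_name",
          "transformation_batch_id", "hostname", "error_datetime"]

def to_error_record(row):
    """Keep the shared audit columns, fold the rest into error_data."""
    payload = {k: v for k, v in row.items() if k not in SHARED}
    record = {k: row.get(k) for k in SHARED}
    record["error_data"] = "|".join(
        f"{k}={v}" for k, v in sorted(payload.items()))
    return record

person_err = {"first_name": "Jo", "last_name": "Doe", "dob": "1970-01-01",
              "error_desc": "bad dob", "error_code": "E1",
              "transformation_name": "load_person",
              "transformation_batch_id": 7, "hostname": "etl01",
              "error_datetime": "2014-07-29 09:00:00"}

print(to_error_record(person_err)["error_data"])
# -> dob=1970-01-01|first_name=Jo|last_name=Doe
```

The mapping steps mentioned in the edit below the post serve a similar purpose: each caller adapts its own fields to the shared sub-transformation's interface.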

----------------------------------------------------------------------------
I got this to work using the mapping steps. :)
Attached Files

Filter rows based on text file input

Hi, I have a csv file with 3 columns in it.

CustomerId CustomerNumber CustomerName
1 324 Smith & Co.
2 1453 J&J Mining
4 7654 Jones Enterprise
5 3897 Lubbock & Company


I have another csv/text file containing customer names as omit values. The number of rows in this file varies:

Omit
J&J Mining
Jones Enterprise


I want a transformation that loops through the omit file and deletes matching rows from the input file. In the example above, the output would be:

CustomerId CustomerNumber CustomerName
1 324 Smith & Co.
5 3897 Lubbock & Company

Any ideas? I thought Stream lookup might work, but I had no luck. Any help appreciated.
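The Stream lookup idea can work: look up CustomerName against the omit file and keep only the rows with no match (in PDI, one way is a Filter rows step that keeps rows whose lookup result is null). The same logic, sketched in Python:

```python
# Rows from the main csv file.
customers = [
    (1, 324, "Smith & Co."),
    (2, 1453, "J&J Mining"),
    (4, 7654, "Jones Enterprise"),
    (5, 3897, "Lubbock & Company"),
]

# Names read from the omit file; its length can vary freely.
omit = {"J&J Mining", "Jones Enterprise"}

# Stream lookup + Filter rows equivalent: keep a row only when
# its CustomerName is absent from the omit set.
kept = [row for row in customers if row[2] not in omit]

for row in kept:
    print(row)
```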

CCC Line Chart TrendLine color

Hello!

Is it possible to change the trend line color of a CCC line chart?

I'm using Pentaho CE 5.0.1. I tried some extension points, and tried the trend color axis, but with no success. The problem is that the trend line color ends up very close to the real line's color, so the client asked to change it.


Thanks!!

Running multiple pipelines in a transformation

Hi,

I want to run multiple pipelines in one transformation:

Pipeline 1: table input --> sorter --> text output

Pipeline 2: table input --> calculator --> text output

both in the same transformation (without a job). If it is possible, please tell me the way.

NoClassDefFound on TextFileOutput

Hello,
I've developed a transformation that gets some data from a DB, validates it and, with a JavaScript step, translates it to a String (containing JSON).

The JSON plus a file name/path are passed to a Text file output step that writes the generated JSON to a .log file.

When I run this transformation in Kettle it works like a charm, but when I deploy it as a .jar the job fails (while still doing its work by writing the file correctly... yet halting the job in which the transformation is contained!).

I think I'm missing some library, but I've tried almost every possible Drools library version and still have no luck making this error go away.

The error is the following:

Code:

2014/07/29 09:37:33 - SAVE JSON TO FILE.0 - ERROR (version 5.1-SNAPSHOT, build 1 from 2014-07-09 10.24.48 by tomcat) : Unexpected error
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 - ERROR (version 5.1-SNAPSHOT, build 1 from 2014-07-09 10.24.48 by tomcat) : java.lang.NoClassDefFoundError: org/drools/util/StringUtils
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.processRow(TextFileOutput.java:176)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.lang.Thread.run(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 - Caused by: java.lang.ClassNotFoundException: org.drools.util.StringUtils
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.net.URLClassLoader$1.run(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.net.URLClassLoader$1.run(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.security.AccessController.doPrivileged(Native Method)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.net.URLClassLoader.findClass(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.lang.ClassLoader.loadClass(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    at java.lang.ClassLoader.loadClass(Unknown Source)
2014/07/29 09:37:33 - SAVE JSON TO FILE.0 -    ... 3 more

Any thoughts?

Thanks,
Bye



edit: never mind, I found the issue.
I had tried everything except the exact library versions (drools-core and drools-compiler) that were contained in the Kettle "lib" folder.
I specified them in the pom and now it works.
I'm leaving this here in case someone gets stuck like me...

Sorry

CCC Charts: Outer ring to donut/pie (inner ring to donut/pie)

Hi Leao,

I would like to give pie/donut charts more insight and a better look, and found the example below from your work:

http://jsfiddle.net/duarteleao/wBzGD/

How can I make use of this code for a pie/donut chart?

I have tried it in PreExecution (and other code areas) with no success. How can I transfer the code into the CDE code-execution areas to make it work?

As an enhancement to the ring:
1) Is a slice-colored outer ring possible? For instance, if there are 3 slices on the pie chart (red, green, black), is there any way to get an outer ring in these 3 colors covering the slices?
2) How can I do the same on the inside of a donut (1 outer ring, 1 inner ring, with the slices in the middle of the donut; is that possible)?

Thank you.

Problem with embedding CDE dashboard into application

Hi everyone!

I am having some trouble embedding a dashboard with the API:

http://localhost:8080/pentaho/plugin...ile=hello.wcdf

My code:

PHP Code:

<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localhost:8080/pentaho/plugin/pentaho-cdf-dd/api/renderer/getHeaders?solution=public&path=/plugin-samples/pentaho-cdf-dd/tests&file=testCCCv2-II.wcdf");
curl_setopt($ch, CURLOPT_USERPWD, "admin:password");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
echo "<html><head>";
echo "<script src='js/jquery-1.8.3.js' type='text/javascript'></script>";
$result = curl_exec($ch);
echo $result;
echo "</head><body>";
curl_setopt($ch, CURLOPT_URL, "http://localhost:8080/pentaho/plugin/pentaho-cdf-dd/api/renderer/getContent?solution=public&path=/plugin-samples/pentaho-cdf-dd/tests&file=testCCCv2-II.wcdf");
$result = curl_exec($ch);
echo $result;
echo "</body>";
curl_close($ch);
?>

It loads only text, without any graphical elements (charts, backgrounds, pictures, etc.). The Chrome console shows me:

Code:

Failed to load resource: the server responded with a status of 404 (Not Found) http://localhost/pentaho/plugin/pent...=1406623709131
Failed to load resource: the server responded with a status of 404 (Not Found) http://localhost/pentaho/plugin/pent...=1406623709131
Uncaught SyntaxError: Unexpected token ILLEGAL localhost:8080/pentaho/api/repos/pentaho-cdf/js/cdf-blueprint-script-includes.js?v=02ad52cc9e2bdb9bbaff5b77c0b62458:2476
Uncaught ReferenceError: BaseComponent is not defined localhost:8080/pentaho/api/repos/pentaho-cdf-dd/js/CDF.js?v=58dc642c5e8d7eb35465f897ab09a806:3
Uncaught ReferenceError: Dashboards is not defined localhost/info2.php:468
Uncaught ReferenceError: Dashboards is not defined localhost/info2.php:473



There are "" symbols in the file cdf-blueprint-script-includes.js. If I include the correct js file with this line of code:

PHP Code:

echo "<script src='js/cdf-blueprint-script-includes.js' type='text/javascript'></script>";

The loaded page still shows no images, charts, etc., but now I can see the dashboard structure, and in the console we have:

Code:

GET http://localhost/pentaho/plugin/pent...=1406623709131 404 (Not Found) info2.php:10
GET http://localhost/pentaho/plugin/pent...=1406623709131 404 (Not Found) info2.php:13
Uncaught SyntaxError: Unexpected token ILLEGAL cdf-blueprint-script-includes.js:2476
CDF: InitInstance 0 cdf-blueprint-script-includes.js:3610
[Lifecycle >Start] Init[0] (Running: 1) cdf-blueprint-script-includes.js:3656
[Lifecycle >Start] render_barChart [cccBarChart] (P: 5 ): preExecution Timing: 0ms since start, 0ms since last event (Running: 1) cdf-blueprint-script-includes.js:3619
[Lifecycle >Start] render_lineChart [cccLineChart] (P: 5 ): preExecution Timing: 0ms since start, 0ms since last event (Running: 2) cdf-blueprint-script-includes.js:3619
[Lifecycle >Start] render_pieChart [cccPieChart] (P: 5 ): preExecution Timing: 0ms since start, 0ms since last event (Running: 3) cdf-blueprint-script-includes.js:3619
[Lifecycle <End  ] render_barChart [cccBarChart] (P: 5 ): postExecution Timing: 2.6s since start, 2.6s since last event (Running: 4) cdf-blueprint-script-includes.js:3619
[Lifecycle <End  ] render_lineChart [cccLineChart] (P: 5 ): postExecution Timing: 2.6s since start, 2.6s since last event (Running: 3) cdf-blueprint-script-includes.js:3619
[Lifecycle <End  ] render_pieChart [cccPieChart] (P: 5 ): postExecution Timing: 2.7s since start, 2.7s since last event (Running: 2) cdf-blueprint-script-includes.js:3619
Uncaught TypeError: Cannot read property 'style' of null translator.js:106
[Lifecycle <End  ] Init[0] (Running: 0) cdf-blueprint-script-includes.js:3666

I am trying this on Pentaho 5.0.1 stable and a 5.1 snapshot, with both the latest stable and trunk versions of the CTools plugins. I have attached two screenshots, before and after adding the blueprint js file.
I will be grateful for any help!

P.S. Sorry for my bad English :)
Attached Images

