Channel: Pentaho Community Forums

Creating subtables in MongoDB

I have two PostgreSQL tables with the following data:
houses:
Code:

-# select * from houses; 
  id |    address
 ----+----------------
  1 | 123 Main Ave.
  2 | 456 Elm St.
  3 | 789 County Rd.
 (3 rows)

and people:
Code:

-# select * from people;
 id | name  | house_id
----+-------+----------
  1 | Fred  |        1
  2 | Jane  |        1
  3 | Bob   |        1
  4 | Mary  |        2
  5 | John  |        2
  6 | Susan |        2
  7 | Bill  |        3
  8 | Nancy |        3
  9 | Adam  |        3
(9 rows)

In Spoon I have two Table Input steps. The first, named House Input, uses the SQL:
Code:

SELECT  id , address
FROM houses
ORDER BY id;

The second Table Input step is named People Input and uses the SQL:
Code:

SELECT  "name" , house_id
FROM people
ORDER BY house_id;

Both Table Input steps feed into a Merge Join, which uses House Input as the first step with a key of id, and People Input as the second step with a key of house_id.

I then send this to a MongoDB Output step with database demo, collection houses, and Mongo document fields address and name (I am expecting MongoDB to assign the _id).

When I run the transformation and type db.houses.find(); from a Mongo shell, I get:

Code:

{ "_id" : ObjectId("52083706b251cc4be9813153"), "address" : "123 Main Ave.", "name" : "Fred" }
{ "_id" : ObjectId("52083706b251cc4be9813154"), "address" : "123 Main Ave.", "name" : "Jane" }
{ "_id" : ObjectId("52083706b251cc4be9813155"), "address" : "123 Main Ave.", "name" : "Bob" }
{ "_id" : ObjectId("52083706b251cc4be9813156"), "address" : "456 Elm St.", "name" : "Mary" }
{ "_id" : ObjectId("52083706b251cc4be9813157"), "address" : "456 Elm St.", "name" : "John" }
{ "_id" : ObjectId("52083706b251cc4be9813158"), "address" : "456 Elm St.", "name" : "Susan" }
{ "_id" : ObjectId("52083706b251cc4be9813159"), "address" : "789 County Rd.", "name" : "Bill" }
{ "_id" : ObjectId("52083706b251cc4be981315a"), "address" : "789 County Rd.", "name" : "Nancy" }
{ "_id" : ObjectId("52083706b251cc4be981315b"), "address" : "789 County Rd.", "name" : "Adam" }


What I want to get is something like:
Code:

{ "_id" : ObjectId("52083706b251cc4be9813153"), "address" : "123 Main Ave.", "people" : [
            { "_id" : ObjectId("52083706b251cc4be9813154"), "name" : "Fred"} ,
          { "_id" : ObjectId("52083706b251cc4be9813155"), "name" : "Jane" } ,
          { "_id" : ObjectId("52083706b251cc4be9813155"), "name" : "Bob" }   
      ] 
},
{ "_id" : ObjectId("52083706b251cc4be9813156"), "address" : "345 Elm St.", "people" : [
          { "_id" : ObjectId("52083706b251cc4be9813157"), "name" : "Mary"} ,
          { "_id" : ObjectId("52083706b251cc4be9813158"), "name" : "John" } ,
          { "_id" : ObjectId("52083706b251cc4be9813159"), "name" : "Susan" }
      ] 
},
{ "_id" : ObjectId("52083706b251cc4be981315a"), "address" : "789 County Rd.", "people" : [
          { "_id" : ObjectId("52083706b251cc4be981315b"), "name" : "Mary"} ,
          { "_id" : ObjectId("52083706b251cc4be981315c"), "name" : "John" } ,
          { "_id" : ObjectId("52083706b251cc4be981315d"), "name" : "Susan" }
      ] 
}

I know why I am getting what I am getting, but can't seem to find anything online or in the examples to get me where I want to be.
I was hoping someone could nudge me in the right direction, point to an example that is closer to what I am trying to accomplish, or tell me that this is out of scope for what Kettle is supposed to do (Hopefully not the latter).
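
For reference, the grouping I am after could be expressed in the mongo shell with an aggregation over the flat collection the transformation currently produces; a minimal sketch (the collection name houses_flat is hypothetical, and the per-person _id values are left out):
Code:

// group the flat rows by address and push the names into a "people" array
db.houses_flat.aggregate([
    { "$group":   { "_id": "$address", "people": { "$push": { "name": "$name" } } } },
    { "$project": { "_id": 0, "address": "$_id", "people": 1 } }
]);

What I am looking for is the Kettle equivalent of that $group/$push, performed before (or inside) the MongoDB Output step.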

Thanks in advance for any help

Devin
PDI 4.4.0

"Get data from XML" issues with namespace

I've been working on this all day and can not get any data to come back. The XML file I have is large, but here is an example of the parts that matter...

Code:

<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsa="http://www.w3.org/2005/08/addressing">
  <env:Header>
    <wsa:Action>http://xmlns.oracle.com/apps/cdm/foundation/parties/organizationService/applicationModule//OrganizationService/findOrganizationResponse</wsa:Action>
    <wsa:MessageID>urn:uuid:1ebd1c41-1072-40cc-bb68-8728e6de52eb</wsa:MessageID>
  </env:Header>
  <env:Body>
    <ns0:findOrganizationResponse xmlns:ns0="http://xmlns.oracle.com/apps/cdm/foundation/parties/organizationService/applicationModule/types/">
      <ns2:result xmlns:ns2="http://xmlns.oracle.com/apps/cdm/foundation/parties/organizationService/applicationModule/types/" xmlns:ns1="http://xmlns.oracle.com/apps/cdm/foundation/parties/organizationService/" xmlns:ns4="http://xmlns.oracle.com/apps/cdm/foundation/parties/contactPointService/" xmlns:ns3="http://xmlns.oracle.com/apps/cdm/foundation/parties/partyService/" xmlns:tns="http://xmlns.oracle.com/adf/svc/errors/" xmlns:ns0="http://xmlns.oracle.com/adf/svc/types/" xmlns:ns8="http://xmlns.oracle.com/apps/cdm/foundation/parties/relationshipService/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="ns1:OrganizationPartyResult">
        <ns1:Value>
          <ns1:PartyId>100000000011634</ns1:PartyId>
          <ns1:OriginalSystemReference xmlns:ns5="http://xmlns.oracle.com/apps/cdm/foundation/parties/flex/sourceSystemRef/">
            <ns3:OrigSystemReference>128016</ns3:OrigSystemReference>
          </ns1:OriginalSystemReference>
        </ns1:Value>
        <ns1:Value>
          <ns1:PartyId>100000000011784</ns1:PartyId>
          <ns1:OriginalSystemReference xmlns:ns5="http://xmlns.oracle.com/apps/cdm/foundation/parties/flex/sourceSystemRef/">
            <ns3:OrigSystemReference>127919</ns3:OrigSystemReference>
          </ns1:OriginalSystemReference>
        </ns1:Value>
        <ns1:Value>
          <ns1:PartyId>100000000011785</ns1:PartyId>
          <ns1:OriginalSystemReference xmlns:ns5="http://xmlns.oracle.com/apps/cdm/foundation/parties/flex/sourceSystemRef/">
            <ns3:OrigSystemReference>127920</ns3:OrigSystemReference>
          </ns1:OriginalSystemReference>
        </ns1:Value>
      </ns2:result>
    </ns0:findOrganizationResponse>
  </env:Body>
</env:Envelope>

I saved that exact code into a file and selected it as the only file in the "Selected files:" in my "Get data from XML" step and checked "Namespace aware?". When I click the "Get XPath nodes" button, the loop I want is an option, "/env:Envelope/env:Body/ns0:findOrganizationResponse/ns2:result/ns1:Value". However, when I go to the "Fields" tab and click "Get Fields", I get the following error, "Exception occurred evaluating XPath: /env:Envelope/env:Body/ns0:findOrganizationResponse/ns2:result/ns1:Value. Exception: Xpath expression uses unbound namespace prefix ns0".

If I back the loop up to not include the ns0 element and use "/env:Envelope/env:Body" as my XPath loop, my "Get fields" now works, populating my field table with...

Code:

Name                | XPath                                                                                                | ...
PartyId            | ns0:findOrganizationResponse/ns2:result/ns1:Value/ns1:PartyId                                        | ...
OrigSystemReference | ns0:findOrganizationResponse/ns2:result/ns1:Value/ns1:OriginalSystemReference/ns3:OrigSystemReference | ...
result              | ns0:findOrganizationResponse/ns2:result                                                              | ...

I don't need the "result" field, so I delete that. This should also give me the results I want, but when I run this, I get a blank record.

Resulting File:
Code:

PartyId,OrigSystemReference
,

What I want to get back is:
Code:

PartyId,OrigSystemReference
100000000011634,128016
100000000011784,127919
100000000011785,127920
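
One thing I noticed while experimenting (I am not sure it is the right fix): the ns0 prefix is bound to two different URIs in this document - once on findOrganizationResponse and again, with a different URI, on ns2:result - which may be what confuses the prefix-based XPath. A possible workaround sketch is a prefix-free loop XPath using local-name():
Code:

/*[local-name()='Envelope']/*[local-name()='Body']/*[local-name()='findOrganizationResponse']/*[local-name()='result']/*[local-name()='Value']

with the fields then defined relative to the loop, e.g. *[local-name()='PartyId'] and *[local-name()='OriginalSystemReference']/*[local-name()='OrigSystemReference'].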

At this point, ANY help would be greatly appreciated.
-Eric

SolutionRepositoryServiceProxy.ERROR_0001 in Administrator Console

Hi all.


When I create a new schedule, the following message is displayed.
I can't create a new schedule.

---------------------------------------------------------
SolutionRepositoryServiceProxy.ERROR_0001
Unable to get content of solution repository. Element type "file" must be followed by either attribute specifications, ">" or "/>".
---------------------------------------------------------

Please help.

Convert date to string and get only year part from calender date in pentaho CDE

Very urgent please...

I have a parameter start_date (e.g. 2013-01-05). I want to pass only 2013 into my MDX query (i.e., I have to take only the year part, 2013, and convert it to a string for MDX).
Can someone please help me?
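
To show what I mean, a sketch of what I imagine the CDE side could look like (the function below is purely illustrative, e.g. as a custom-parameter or pre-execution function; Dashboards.getParameterValue is the CDF call for reading a parameter value):
Code:

// illustrative only: derive a "year" string from the start_date parameter (e.g. "2013-01-05" -> "2013")
function deriveYear() {
    var startDate = Dashboards.getParameterValue('start_date'); // "2013-01-05"
    return String(startDate).substring(0, 4);                   // "2013" as a string for the MDX query
}

The resulting string would then be substituted into the MDX query like any other parameter.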


Sadakar

Deleting old file and creating new file for Hive external Table

Hi all,

I have a flat file (TSV) that is generated on a daily basis for a Hive external table in Hadoop.

I need to delete the old file that the Hive external table refers to and create a new file for the same table with the same .tsv file name.

I don't know how to delete the file and create the new file at the same path.

I'm using Ubuntu 12.04 and Pentaho Data Integration CE 4.4.
Please let me know if there is a way to automate this process, as I'm currently doing this job manually.

I really appreciate your inputs.

Malibu.

Self joins in Metadata

Hi

I am just wondering if it is possible to create a business model for self joins that I can later use with saiku-reporting.

Kaapa from IRC says that you can't have self joins, but I can achieve the same thing by creating a new business view for the same table.

But if I do that, I will have duplicate views.
For example: if I have a table called "Student" and I want a self join on it, I need to have two views, "Student" and "Student-Copy-For-selfjoins".

I understand this solves the problem, but it looks ugly, especially for end users.

Have you ever come across this situation before? If so, I am curious to know how you solved it.

Thanks for your time!

many inputs in a folder

Hello,
in the "Text file input" step I'm trying to set the input as a folder, where I have the input files. I want to transform all input files in a folder without changing the input manually.
All files are in the folder C:\Users\David\Desktop\Example\Vstup

In the "Text file input" step i set a path to C:\Users\David\Desktop\Example\Vstup\*.txt but it does not work.
Thanks

attribute reduction

Hi,
After performing one round of attribute selection with CfsSubsetEval and best-first search (BFS), I need to reduce the number of attributes further.
Instead of using CfsSubsetEval and BFS a second time, can you please tell me what other methods are available in Weka 3.6.8 to reduce the attributes further?

Thanks in advance..

Installation of Pentaho community vesrion

Hi

I have downloaded the Pentaho community version for Windows, and also the jar files of the CDF and CDE frameworks.
Can anyone please tell me the procedure to install the community version and then CDF and CDE, so that I can start my work with
the dashboard tools?

Problem connecting to OpenEdge 10.2B Database

I'm trying to connect to an OpenEdge 10.2B database using Spoon. The connection was successful in SQuirreL SQL, so I have all the drivers and parameters. How can I make a new connection type? Where do I put the Openedge.jar file, and how do I configure the connection? Many thanks.

Report Validation Failed in Pentaho BI server 4.8 community

I have created a prpt file in PRD 3.9.1 using a MySQL JDBC connection. When I view the report output through PRD it displays fine, but in Pentaho BI Server 4.8.0 community edition the same prpt does not run: it gives a "report validation failed" error for HTML output, and for XLS output it gives a "file not found" error.

Only the PDF, text, and rich text outputs display correctly.

http://localhost:8080/pentaho/conten...nalWidth=false

The following is the pentaho.log error:

Code:

2013-08-12 16:48:28,515 ERROR [org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor] 22316606: Report processing failed.
2013-08-12 16:48:28,515 ERROR [org.pentaho.reporting.platform.plugin.SimpleReportingComponent] [execute] Component execution failed.
org.pentaho.reporting.engine.classic.core.ReportProcessingException: Failed to process the report
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1690)
    at org.pentaho.reporting.platform.plugin.output.PageableHTMLOutput.generate(PageableHTMLOutput.java:191)
    at org.pentaho.reporting.platform.plugin.SimpleReportingComponent.execute(SimpleReportingComponent.java:1069)
    at org.pentaho.reporting.platform.plugin.ExecuteReportContentHandler.doExport(ExecuteReportContentHandler.java:232)
    at org.pentaho.reporting.platform.plugin.ExecuteReportContentHandler.createReportContent(ExecuteReportContentHandler.java:68)
    at org.pentaho.reporting.platform.plugin.ReportContentGenerator.createContent(ReportContentGenerator.java:67)
    at org.pentaho.platform.engine.services.solution.SimpleContentGenerator.createContent(SimpleContentGenerator.java:66)
    at org.pentaho.platform.web.servlet.GenericServlet.doGet(GenericServlet.java:261)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:142)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
    at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:60)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:861)
    at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:579)
    at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1584)
    at java.lang.Thread.run(Thread.java:619)
ParentException:
org.pentaho.reporting.engine.classic.core.function.FunctionProcessingException: ReportStarted failed
    at org.pentaho.reporting.engine.classic.core.layout.output.DefaultOutputFunction.reportStarted(DefaultOutputFunction.java:178)
    at org.pentaho.reporting.engine.classic.core.states.InitialLayoutProcess.fireReportStartedEvent(InitialLayoutProcess.java:282)
    at org.pentaho.reporting.engine.classic.core.states.InitialLayoutProcess.fireReportEvent(InitialLayoutProcess.java:215)
    at org.pentaho.reporting.engine.classic.core.states.SubLayoutProcess.fireReportEvent(SubLayoutProcess.java:257)
    at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.fireReportEvent(ProcessState.java:1023)
    at org.pentaho.reporting.engine.classic.core.states.process.ReportHeaderHandler.advance(ReportHeaderHandler.java:47)
    at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.advance(ProcessState.java:818)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processPaginationLevel(AbstractReportProcessor.java:735)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:541)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1647)
    at org.pentaho.reporting.platform.plugin.output.PageableHTMLOutput.generate(PageableHTMLOutput.java:191)
    at org.pentaho.reporting.platform.plugin.SimpleReportingComponent.execute(SimpleReportingComponent.java:1069)
    at org.pentaho.reporting.platform.plugin.ExecuteReportContentHandler.doExport(ExecuteReportContentHandler.java:232)
    at org.pentaho.reporting.platform.plugin.ExecuteReportContentHandler.createReportContent(ExecuteReportContentHandler.java:68)
    at org.pentaho.reporting.platform.plugin.ReportContentGenerator.createContent(ReportContentGenerator.java:67)
    at org.pentaho.platform.engine.services.solution.SimpleContentGenerator.createContent(SimpleContentGenerator.java:66)
    at org.pentaho.platform.web.servlet.GenericServlet.doGet(GenericServlet.java:261)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:142)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
    at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:60)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:861)
    at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:579)
    at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1584)
    at java.lang.Thread.run(Thread.java:619)
ParentException:
java.lang.NullPointerException
    at java.util.Hashtable.put(Hashtable.java:394)
    at sun.font.PhysicalStrike.getGlyphPoint(PhysicalStrike.java:112)
    at sun.font.SunLayoutEngine.nativeLayout(Native Method)
    at sun.font.SunLayoutEngine.layout(SunLayoutEngine.java:133)
    at sun.font.GlyphLayout$EngineRecord.layout(GlyphLayout.java:648)
    at sun.font.GlyphLayout.layout(GlyphLayout.java:447)
    at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:579)
    at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1584)
    at java.lang.Thread.run(Thread.java:619)

Http header

Hi, is there any way to view the response headers of an HTTP call?
I'll be more specific: I need to see when a resource (pointed to by a URL on the web) was last modified. Knowing the date of the last modification, I can decide whether to download it or not. I think one way to do that is to look at the headers of an HTTP call. Any suggestions?
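
One idea (only a sketch, and I am not sure it is the cleanest approach): a "Modified Java Script Value" step can call Java classes, so a HEAD-style check of the Last-Modified header could look roughly like this (the url field name is an assumption):
Code:

// rough sketch: read the Last-Modified header of a URL from a Modified Java Script Value step
var conn = new java.net.URL(url).openConnection();
conn.setRequestMethod("HEAD");                              // ask for headers only, not the body
var lastModified = conn.getHeaderField("Last-Modified");    // may be null if the server does not send it
var lastModifiedMillis = conn.getLastModified();            // same information as milliseconds, 0 when unknown
conn.disconnect();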

How to specify XML output element names different than input field names

Hi,

This might be simple, but I'm new to Kettle. I'm working on a transformation of data from a database source to XML - I've got a small example doing this with a CSV file. So far so good, but the fields from the database table must be mapped to different names in my XML document. In the example, using a CSV file as input, it seems that the element names in my XML output are somehow limited/restricted to the names coming from the input source.

I'm trying to figure out the preferred way to do this kind of transformation - I have an XSD that defines the XML structure and data types. I would prefer a solution where I can make use of the XSD, or at least of a pre-generated XML document, for the XML output.

Example:

Input from database:
Field A, B, C, D.

XML output fields/structure:

<RootElement>
  <Report>
    <F1100>TheValueFromA</F1100>
    <X2200>TheValueFromB</X2200>
    <M2233>TheValueFromC</M2233>
    <Z0123>TheValueFromD</Z0123>
  </Report>
  <Report>
    <F1100>TheValueFromA</F1100>
    <X2200>TheValueFromB</X2200>
    <M2233>TheValueFromC</M2233>
    <Z0123>TheValueFromD</Z0123>
  </Report>
</RootElement>

I've read several posts here on the forum but I didn't find the answer.
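
To make the mapping concrete, this is the kind of field-to-element renaming I am after (whether the XML output step's Fields tab supports an element name that differs from the field name is exactly what I am unsure about):
Code:

Fieldname | Element name
A         | F1100
B         | X2200
C         | M2233
D         | Z0123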

Thanks.

Spoon - ERROR

Hi,
I'm trying to preview my transformation, but I'm getting the error "Unable to get fields from previous steps because of an error" in every Table Input step of the transformation.
Please see the attachment.

Below error:

[2013/08/12 16:38:36 - C:\Users\U130330\Desktop\DIM_CLIENTES_20130812.ktr : DIM_CLIENTES_20130812 - Dispatching started for transformation [C:\Users\U130330\Desktop\DIM_CLIENTES_20130812.ktr : DIM_CLIENTES_20130812]
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unexpected error
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : An error occurred executing SQL:
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : SELECT GESTOR.RESTRTOCOMPS, GESTOR.BNAME, GESTOR.CODGESTOR, GESTOR.NAME
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : FROM GESTOR;
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : [Microsoft][ODBC Microsoft Access Driver] The Microsoft Access database engine cannot find the input table or query 'GESTOR'. Make sure it exists and that its name is spelled correctly.
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.openQuery(Database.java:1857)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.tableinput.TableInput.doQuery(TableInput.java:233)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:143)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at java.lang.Thread.run(Thread.java:619)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Caused by: java.sql.SQLException: [Microsoft][ODBC Microsoft Access Driver] The Microsoft Access database engine cannot find the input table or query 'GESTOR'. Make sure it exists and that its name is spelled correctly.
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at sun.jdbc.odbc.JdbcOdbc.createSQLException(JdbcOdbc.java:6957)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at sun.jdbc.odbc.JdbcOdbc.standardError(JdbcOdbc.java:7114)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at sun.jdbc.odbc.JdbcOdbc.SQLExecDirect(JdbcOdbc.java:3110)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at sun.jdbc.odbc.JdbcOdbcStatement.execute(JdbcOdbcStatement.java:338)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at sun.jdbc.odbc.JdbcOdbcStatement.executeQuery(JdbcOdbcStatement.java:253)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.database.Database.openQuery(Database.java:1843)
2013/08/12 16:38:36 - Table input.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ... 4 more
2013/08/12 16:38:36 - Table input.0 - Finished reading query, closing connection.
2013/08/12 16:38:36 - Table input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2013/08/12 16:38:36 - C:\Users\U130330\Desktop\DIM_CLIENTES_20130812.ktr : DIM_CLIENTES_20130812 - C:\Users\U130330\Desktop\DIM_CLIENTES_20130812.ktr : DIM_CLIENTES_20130812
2013/08/12 16:38:36 - C:\Users\U130330\Desktop\DIM_CLIENTES_20130812.ktr : DIM_CLIENTES_20130812 - C:\Users\U130330\Desktop\DIM_CLIENTES_20130812.ktr : DIM_CLIENTES_20130812]

Can someone tell me why I get this error and where I am going wrong?

Thank you

interaction data with Protovis

Guys, I need help.


I have a Protovis chart, and I need it to interact with another chart: when I click on the Protovis chart, the clicked category should become selected in the other chart.
I do not know which event I should implement in Protovis, or how to receive this data.


With CCC charts I do it like this (how do I do the same in Protovis?):

var Gerente = scene.vars.category.label; // get selection

var varChart = Dashboards.getComponent("render_Distribuicao_fat"); // get object (the other chart)

// Clear all selections
var datumsAll = varChart.chart.data.datums();
datumsAll.each(function(datum) { datum.setSelected(false); });

// Set the selection
datums = varChart.chart.data.datums([{category: Gerente }]);
datums.each(function(datum) { datum.setSelected(true); });

varChart.chart.render(/*bypassAnimation*/ false, /*recreate*/ true, /*reload*/ false);
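
For reference, a minimal sketch of how a click handler might look on a Protovis mark, reusing the same CCC selection logic as above (pv.Bar, myData and the category property name are placeholders for my actual mark and data):
Code:

// sketch: attach a click handler to a Protovis mark and reuse the CCC selection logic
vis.add(pv.Bar)
    .data(myData)                      // each datum is assumed to carry a "category" property
    .event("click", function(d) {
        var Gerente = d.category;      // the clicked category

        var varChart = Dashboards.getComponent("render_Distribuicao_fat");
        varChart.chart.data.datums().each(function(datum) { datum.setSelected(false); });
        varChart.chart.data.datums([{category: Gerente}]).each(function(datum) { datum.setSelected(true); });
        varChart.chart.render(false, true, false);
    });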

Publish report error

I'm trying to publish my report from Pentaho Report Designer and I'm receiving an error directing me to see the server logs. The pentaho.log file in the Tomcat logs directory under the biserver-ce directory gives me the following message, which I'm unable to decipher. I am new to PRD and would appreciate some direction; I was not sure which log file to investigate and assumed this was the correct one.

Report designer 3.8.2/Mac OSx 10.7.5

log file message - Pentaho.log (biserver-ce-4.8.0-stable/biserver-ce/tomcat/logs/pentaho.log)

2013-08-12 09:21:52,298 ERROR [org.pentaho.platform.web.hsqldb.HsqlDatabaseStarterBean] HsqlDatabaseStarterBean.ERROR_0006 - The default port of 9001 is already in use. Do you already have HSQLDB running in another process? The HSQLDB Starter cannot continue.
2013-08-12 09:21:52,299 WARN [org.pentaho.platform.web.hsqldb.HsqlDatabaseStarterBean] Due to a previous error, no databases will be available.
2013-08-12 09:21:56,952 WARN [org.pentaho.hadoop.shim.HadoopConfigurationLocator] Unable to load Hadoop Configuration from "file:///Users/username/Documents/Pentaho/biserver-ce-4.8.0-stable/biserver-ce/pentaho-solutions/system/kettle/plugins/pentaho-big-data-plugin/hadoop-configura
2013-08-12 09:21:57,046 WARN [org.apache.axis2.descript
2013-08-12 09:21:58,910 WARN [org.pentaho.reporting.libraries.base.boot.PackageManager] Unresolved dependency for package: org.pentaho.reporting.engine.classic.extensions.datasources.cda.CdaModule
2013-08-12 09:21:58,937 WARN [org.pentaho.reporting.libraries.base.boot.PackageSorter] A dependent module was not found in the list of known modules.
2013-08-12 09:22:03,828 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]

Cube: managing data access rights (roles)

Hello,

I want to filter access to my cube's data according to the role my user belongs to.

To do this I set up "roles" in my XML schema in Schema Workbench; here is an example:

<Role name="DDT_40">
<SchemaGrant access="all">
<CubeGrant cube="Suivi activit&#233;" access="custom">
<HierarchyGrant hierarchy="H_UNITE_SAISIE" access="custom">
<MemberGrant member="[H_UNITE_SAISIE].[DDTM].[193]" access="all">
</MemberGrant>
</HierarchyGrant>
</CubeGrant>
</SchemaGrant>
</Role>

When I connect to the Pentaho portal with a user belonging to the DDT_40 role, open my cube in Saiku Analytics, and select elements of the H_UNITE_SAISIE hierarchy, I only see the DDTM and 193 data: this matches my filter.
Example of the results obtained:
DDTM/193, January 2013: revenue is 123 M
DDTM/193, January 2012: revenue is 120 M

If I remove the H_UNITE_SAISIE dimension from my table, the restriction on the DDT is lost:
January 2013: revenue is 620 M
January 2012: revenue is 600 M
January 2011: revenue is 580 M

I should normally see:
January 2013: revenue is 123 M
January 2012: revenue is 120 M

How can I make sure the restriction on the data is preserved?
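
One thing that might be worth checking (I am not certain this is the cause): Mondrian's default rollup policy is "full", so totals computed without the restricted dimension include all members, even the ones the role cannot see. A sketch of the same HierarchyGrant with the rollupPolicy attribute set to "partial" (a Mondrian 3.x attribute) so that totals only include accessible members:
Code:

<HierarchyGrant hierarchy="H_UNITE_SAISIE" access="custom" rollupPolicy="partial">
  <MemberGrant member="[H_UNITE_SAISIE].[DDTM].[193]" access="all">
  </MemberGrant>
</HierarchyGrant>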

Thanks in advance for your answers

Difficulties with MapReduce and Kettle

Hello,

I'm new to Pentaho Data Integration and I'm having some difficulty implementing a treatment with MapReduce. I want to extract certain lines from a file using a filter that is contained in another file, and write the result of this query to another file. I tried to do it this way: the map step is the query and the reduce step is the action of writing to the result file, but I'm not really sure this is the right way to do it. Can someone tell me if I'm right?

I'm not sure if I'm really clear, and sorry for my bad English.

Thanks for the help,

Thomas

org.pentaho.di.core.util.StreamLogger IOException

I have what I think is a logging issue when running a job. At an "Execute a shell script" step, I am getting:

- Starting entry [Curl - sumtraf]
INFO 12-08 13:14:44,888 - Curl - sumtraf - Found 0 previous result rows
INFO 12-08 13:14:44,888 - Curl - sumtraf - Running on platform : Linux
INFO 12-08 13:14:44,935 - Curl - sumtraf - Executing command : /opt/log/shoppertrak/kettle_af096400-0372-11e3-93b5-832ee142cbaeshell
INFO 12-08 13:14:44,975 - Curl - sumtraf - (stderr) % Total % Received % Xferd Average Speed Time Time Time Current
INFO 12-08 13:14:44,976 - Curl - sumtraf - (stderr) Dload Upload Total Spent Left Speed
INFO 12-08 13:14:44,976 - Curl - sumtraf - (stderr)
INFO 12-08 13:14:45,302 - Curl - sumtraf - (stderr) 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
INFO 12-08 13:14:45,315 - Curl - sumtraf - Command /opt/log/shoppertrak/kettle_af096400-0372-11e3-93b5-832ee142cbaeshell has finished
ERROR 12-08 13:14:45,317 - Curl - sumtraf - (stderr) java.io.IOException: Stream closed
at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:283)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:325)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:154)
at java.io.BufferedReader.readLine(BufferedReader.java:317)
at java.io.BufferedReader.readLine(BufferedReader.java:382)
at org.pentaho.di.core.util.StreamLogger.run(StreamLogger.java:57)
at java.lang.Thread.run(Thread.java:722)




But this only happens sometimes: I can run it 6 times in a row, and 4 times it works, twice it fails.

I have "Specify logfile" and "Append logfile" checked and row-level logging set in that step.
Any ideas?

pan freezes without explanation

Hi guys, I'm running a transformation with Pan against MySQL with -Xmx4096m, and it freezes right after some constants are added.
It was working fine, and now it always freezes at the same point, but only when I try to work with all the database data, which is about 8 million rows.

I've checked other threads on the forum about transformations freezing without a response, and I didn't find any answer that works for me.


Check attached transformation.