November 1, 2014, 11:09 pm
org.pentaho.di.core.exception.KettleException:
java.util.concurrent.ExecutionException: com.pentaho.commons.dsc.f: license missing, invalid, or expired
com.pentaho.commons.dsc.f: license missing, invalid, or expired
at org.pentaho.di.repository.pur.PurRepository.connect(SourceFile:409)
at org.pentaho.di.ui.repository.RepositoriesHelper.loginToRepository(RepositoriesHelper.java:249)
at org.pentaho.di.ui.repository.controllers.RepositoriesController$3.run(RepositoriesController.java:216)
at java.lang.Thread.run(Unknown Source)
Caused by: java.util.concurrent.ExecutionException: com.pentaho.commons.dsc.f: license missing, invalid, or expired
at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at org.pentaho.di.repository.pur.PurRepository.connect(SourceFile:380)
... 3 more
Caused by: com.pentaho.commons.dsc.f: license missing, invalid, or expired
at com.pentaho.commons.dsc.j.a(SourceFile:70)
at org.pentaho.di.repository.pur.PurRepositoryMeta.getRepositoryCapabilities(SourceFile:79)
at org.pentaho.di.repository.BaseRepositorySecurityProvider.<init>(BaseRepositorySecurityProvider.java:38)
at org.pentaho.di.repository.pur.r.<init>(SourceFile:27)
at org.pentaho.di.repository.pur.b.<init>(SourceFile:22)
at org.pentaho.di.repository.pur.PurRepository$1.a(SourceFile:309)
at org.pentaho.di.repository.pur.PurRepository$1.call(SourceFile:302)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
... 1 more
↧
November 1, 2014, 9:21 am
Hello:
I am investigating whether PDI could be used to extract Asterisk CDR information and then, based on the dialled number, look up its rate from another table and produce a value, e.g. (seconds * rate) + connect charge.
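To make the intended calculation concrete, here is a minimal sketch (not PDI-specific; the field and rate names are hypothetical, and it assumes per-second billing on Asterisk's billsec value plus a flat connect charge):
Code:
// Hypothetical illustration only: assumes per-second billing using the
// Asterisk CDR "billsec" value, a per-second rate looked up from a rate
// table, and a flat connect charge, as described above.
public class CallCharge {
    static double charge(long billsec, double ratePerSecond, double connectCharge) {
        return (billsec * ratePerSecond) + connectCharge;
    }

    public static void main(String[] args) {
        // e.g. a 95-second call at 0.002/second with a 0.05 connect charge
        System.out.println(charge(95, 0.002, 0.05)); // prints roughly 0.24
    }
}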
Has anybody done something similar ?
Thanks.
↧
November 2, 2014, 8:30 am
Hello,
I am looking for ideas to effectively design our Collection Dashboard.
As opposed to Revenues or Expenses, Debts and Collections (as well as Inventory) are not simply "additive"...
There are always "Debts in Process" and the numbers change every day.
So I cannot just add up all the debts (as I can with revenues); rather, I have to show the changes in debts...
I am currently in the middle of brainstorming on how to effectively approach this topic,
and I would very much appreciate any design ideas, best-practice articles, links to examples, etc.
I'll shortly describe our process (I work at a telecom and we use post-paid billing systems):
When a customer doesn't pay an invoice by the due day, that customer "enters a Collection Path".
There are several Collection Paths; a customer always enters exactly one of them (depending on the reason) and always stays on that same Path, until he/she pays off the debt and exits the Path, or until he/she is forwarded to an external lawyer (see below).
Once a customer has entered a Collection Path, he/she "advances" through that Path's Steps.
Each Path has a certain number of Steps (debt-reminder SMS, rep's phone call, supervisor's phone call, etc.), and each Path has a different number of Steps.
A certain number of days has to pass before a customer "advances" to the next Step. This "time between Steps" is defined per Path and per Step. Most customers advance through the Steps automatically according to this definition, but there are deviations: some customers get "stuck" in a Step for technical or business reasons, some are "put back" several Steps by the collection representatives, and so on.
That's it.
As I mentioned, a customer either pays off the debt and thus exits the collection process,
or doesn't pay and "advances" through all the Path's Steps up to the last one ("External Lawyer").
My task is to build an effective dashboard that shows management (we are interested in seeing both money and customer-count measures):
1. The daily advancement of the customers through the Steps (for all the Paths and for each one of the Paths separately)
2. An overall view of "Work in Process"
3. Benchmarks - (Comparison with previous month for instance)
4. Alert reports - when something "goes wrong" (for instance: too many customers "stuck" in a step, etc....)
If any one of you has dealt with this kind of problem (it is similar to tracking advancement through a factory / inventory process),
I will very much appreciate any visual ideas.
Thank you in advance,
Michael
↧
November 2, 2014, 1:51 pm
I've upgraded to 3.6.7 and I've turned on the use of aggregation tables using the following parameters:
mondrian.rolap.aggregates.Use=true
mondrian.rolap.aggregates.Read=true
But I'm still having problems with Mondrian using my aggregation tables, because of the way it matches up physical table names with the names used in the OLAP schema. My app must work on both MySQL and Oracle, which presents many challenges of its own. To write a portable schema for Mondrian I used all caps for table and column names, as explained in the Mondrian FAQ: Oracle converts all non-quoted table/column names to upper case. However, just the opposite happens in MySQL, which converts table names to lower case depending on the settings specified in the my.cnf file.
Technically it might work (using all caps) but I keep seeing this in the logs:
Code:
2014-11-02 14:56:27,070 [RMI TCP Connection(2)-127.0.0.1] WARN mondrian.rolap.aggmatcher.AggTableManager - : No Table found for fact name=JOB_OPENINGS
2014-11-02 14:56:27,070 [RMI TCP Connection(2)-127.0.0.1] WARN mondrian.rolap.aggmatcher.AggTableManager - : No Table found for fact name=PAYCHECK_LINE_ITEMS
2014-11-02 14:56:27,070 [RMI TCP Connection(2)-127.0.0.1] WARN mondrian.rolap.aggmatcher.AggTableManager - : No Table found for fact name=PAYCHECKS
2014-11-02 14:56:27,070 [RMI TCP Connection(2)-127.0.0.1] WARN mondrian.rolap.aggmatcher.AggTableManager - : No Table found for fact name=EMPLOYMENT
2014-11-02 14:56:27,070 [RMI TCP Connection(2)-127.0.0.1] WARN mondrian.rolap.aggmatcher.AggTableManager - : No Table found for fact name=APPLICANTS
2014-11-02 14:56:27,070 [RMI TCP Connection(2)-127.0.0.1] WARN mondrian.rolap.aggmatcher.AggTableManager - : No Table found for fact name=INCOME_STATEMENTS
I found the place in the Mondrian source code where this message is printed. If you see it, it means Mondrian can't find the table that goes with the RolapStar, so Mondrian won't even attempt to find any aggregation tables; it fails immediately. The reason it can't find the table name is that in my OLAP schema I used all caps, but when MySQL returns table names they are lower case. JdbcSchema.addTable() puts whatever the database reports as the table name into a HashMap, which is case-sensitive, so the lookup will never succeed unless everything agrees on case. I used my debugger to add an all-caps version of the table name to the HashMap, and once I did that it found the aggregation tables. Other than that there appears to be no other option. I think this is a bug, since it forces you to use case-sensitive settings even though the Mondrian documentation leaves you with the impression that this is not necessary.
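To illustrate the mismatch in isolation (a sketch only, not Mondrian's actual code): a plain HashMap keyed on the name the JDBC driver returns is case-sensitive, whereas a map with a case-insensitive comparator would match regardless of the case reported by MySQL or Oracle.
Code:
import java.util.Map;
import java.util.TreeMap;

public class CaseInsensitiveTableLookup {
    public static void main(String[] args) {
        // Case-insensitive map: keys compare ignoring case
        Map<String, String> tables = new TreeMap<>(String.CASE_INSENSITIVE_ORDER);

        // Name as MySQL reports it (lower case because of my.cnf / lower_case_table_names)
        tables.put("job_openings", "fact table metadata");

        // Name as it appears in the Mondrian schema (upper case): found with the
        // case-insensitive comparator, missed with a plain HashMap
        System.out.println(tables.get("JOB_OPENINGS"));
    }
}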
I could fix this by subclassing JdbcSchema, but that is also impossible because of the package-protected constructors and the private methods needed to reproduce the code that exists in addTable().
So what are my options? Are there settings in MySQL that would make Mondrian portable between the two databases? I know I can use quoting to try to enforce case-sensitive names, but MySQL and Oracle differ in their quoting characters, and it would be a major rework of my program to go back through and redo every query to include quoting.
↧
November 2, 2014, 9:35 pm
Hi
This is srinivas
I am facing a problem when loading a CSV (Excel) file into Pentaho Report Designer.
Please give me the procedure for loading a CSV file into Pentaho Report Designer.
Thanks,
Srinivas
↧
November 2, 2014, 10:21 pm
Hi
I am using Pentaho version 5.0.7. I have created some jobs on the Data Integration server. I need to run such a job using a URL that passes the username/password as parameters.
Actually, I am trying to run the job from my dashboard: on click of a particular job icon, I need to run the corresponding job that lives on the Data Integration server.
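For illustration only, here is a minimal sketch of calling such a URL from Java, assuming the Carte/DI-server "executeJob" service is enabled on the server; the host, port, web-app path, repository name, job path and credentials below are all placeholders that would need to match the actual installation.
Code:
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Base64;

// Hedged sketch only: assumes the Carte "executeJob" service is available on the
// DI server and that the base URL below matches your installation.
public class RunJobViaUrl {
    public static void main(String[] args) throws Exception {
        String base = "http://localhost:9080/pentaho-di/kettle/executeJob/";
        String query = "rep=" + URLEncoder.encode("my_repository", "UTF-8")
                + "&user=" + URLEncoder.encode("repo_user", "UTF-8")
                + "&pass=" + URLEncoder.encode("repo_password", "UTF-8")
                + "&job=" + URLEncoder.encode("/home/admin/my_job", "UTF-8")
                + "&level=Basic";

        HttpURLConnection conn = (HttpURLConnection) new URL(base + "?" + query).openConnection();
        // HTTP Basic authentication for the server itself (separate from the repository credentials)
        String auth = Base64.getEncoder().encodeToString("admin:password".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        try (InputStream in = conn.getInputStream()) {
            System.out.println("HTTP " + conn.getResponseCode()); // 200 plus an XML body on success
        }
    }
}
Note that passing the username/password in the URL exposes them in server and proxy logs, so that convenience has to be weighed against the security implications.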
Can anyone help me with this?
Thanks
kavitha S
↧
November 3, 2014, 12:26 am
I need suggestions on how to do zoom in and zoom out for the Excel output type in PRD.
Thanks
↧
November 3, 2014, 1:22 am
I am using a multi-attribute dataset for classification. I am using the WEKA API from Java. The dataset has both categorical and numerical variables. When I run the dataset in the Weka GUI I get a better result, with 16 leaves in a tree of size 26. But when I do the same using Java code I only get 3 leaves in a tree of size 5. Here is my Java code:
Code:
public static Evaluation classify(Classifier model,
        Instances trainingSet, Instances testingSet) throws Exception {
    // return the classification model after training with the train set and testing with the test set
    Evaluation evaluation = new Evaluation(trainingSet);
    model.buildClassifier(trainingSet);
    evaluation.evaluateModel(model, testingSet);
    //System.out.println(model);
    return evaluation;
}

Classifier models = new J48(); // a decision tree
models.setOptions(optionsj);
FastVector predictions = new FastVector();
// For each training-testing split pair, train and test the classifier
for (int i = 0; i < trainingSplits.length; i++) {
    Evaluation validation = classify(models, trainingSplits[i], testingSplits[i]);
    predictions.appendElements(validation.predictions());
    System.out.println(validation.toSummaryString("\nResults\n======\n", false));
}
System.out.println(models.toString());
How can I make sure that J48 takes all the attributes in the dataset? What did I do wrong?
Thanks in advance.
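This is not the code above, just a minimal self-contained sketch that makes two settings explicit which often differ between the Explorer GUI and API code and which both affect tree size: the class index and the J48 options (the Explorer defaults are "-C 0.25 -M 2"). It also assumes the data comes from a single ARFF file with a placeholder name, whereas the code above trains on splits, which by itself can produce a smaller tree.
Code:
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class J48Check {
    public static void main(String[] args) throws Exception {
        // "data.arff" is a placeholder file name
        Instances data = DataSource.read("data.arff");
        data.setClassIndex(data.numAttributes() - 1); // same default as the Explorer

        J48 tree = new J48();
        tree.setOptions(new String[] {"-C", "0.25", "-M", "2"}); // Explorer defaults
        tree.buildClassifier(data);

        System.out.println(tree); // compare the leaf count / tree size with the GUI
    }
}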
↧
November 3, 2014, 3:43 am
hello forum,
I would like to copy the content of the field "Bestätigter Termin" into the field "Wunschtermin" when the field "Wunschtermin" is empty.
When the field "Wunschtermin" is not empty, no copy action should be run in the transformation.
Is that possible? And how?
best regards,
Bernett22
↧
November 3, 2014, 4:31 am
Hi Forum,
I need someone to guide me on how I can pass the master batch_id into the sub-jobs and transformations. I have one master job, and inside that job I am calling sub-jobs and transformations. I need to populate the same ID across all of those transformations and jobs.
I have tried simple methods like Set Variables and Get Variables, but without success so far. If anyone can guide me, that would really help me solve my issue.
Sam
↧
November 3, 2014, 6:05 am
Hi,
I am new to Pentaho. I am trying to add a search option for each individual column in a table view, but it is not working. I have added datatables.columnfilter.js and I have added the code below in Post Execution, but it did not work.
$(document).ready(function() {
    $('#emp_detail_clmnTable').dataTable()
        .columnFilter({
            aoColumns: [
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" },
                { type: "text" }
            ]
        });
});
I have also tried:
var table = $('#emp_detail_clmnTable').DataTable();
$("#emp_detail_clmnTable tfoot th").each(function (i) {
    var select = $('<select><option value="">All</option></select>')
        .appendTo($(this).empty())
        .on('change', function () {
            var term = $(this).val() != '' ? '^' + $(this).val() + '$' : '';
            table.column(i)
                .search(term, true, false)
                .draw();
        });
    table.column(i).data().unique().sort().each(function (d, j) {
        select.append('<option value="' + d + '">' + d + '</option>');
    });
});
Nothing is working for me, and there are no errors in the browser console. Can someone please guide me on how to add an individual search option per column? Thanks in advance.
↧
November 3, 2014, 8:03 am
Hello,
I have a table:

ColA | ColB        | TrendArrow
row1 | 12          |
row2 | Not defined |
row3 | -10         |

I have a problem with division by 0, so I would like to test the value in ColB in order to hide the trend arrow.
How is it possible to have, in a trend-arrow column, some rows with a trend and some without?
Thank you for your response.
↧
November 3, 2014, 8:49 am
I have created a job that transfers a bunch of blob-type files from one database to another. The job runs fine with no issues, except for one thing: after the job is completed it leaves a bunch of jtdsxxxxxxxxx.tmp files in my /tmp dir.
Over time (a few hours) I end up with 5 GB of tmp files that are not cleaned up. While I could add a step at the end of the job to remove all such files (something like rm /tmp/jtds*),
I would prefer to figure out how the application should be doing it.
Any suggestions on where to start looking?
Thanks
↧
November 3, 2014, 2:22 pm
Hi,
I'm having a devil of a time trying to read a tab-separated file whose format has recently changed. The only thing that changed is the addition of double quotes; removing those quotes in Notepad makes everything work. No matter which encoding I use, Kettle is not able to read it, and most of the time it finds only a single field instead of the twenty-something fields that are actually in the file. Any ideas would be very highly appreciated...
(Attached: Screen Shot 2014-11-03 at 5.18.01 PM.png)
Thanks!
↧
November 3, 2014, 6:10 pm
Hi all,
I'm writing rows of data to a text file using the Text File Output step.
One of the fields is an address string which contains <br /> whenever the original submission includes a line feed.
The problem is that Pentaho seems to be adding a CR+LF in the middle of the line, so the rest of the address and any subsequent fields appear on the next line.
I can't find any way to change this behaviour in the step options, and our client needs the <br /> characters to stay in the address.
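In case the address field itself contains literal CR/LF characters (just one possible explanation for what is described above), here is a sketch of stripping them out before the output step while leaving the <br /> markers untouched; the field name and logic are purely illustrative.
Code:
public class StripLineBreaks {
    // Removes embedded CR/LF characters but keeps literal "<br />" markers
    static String clean(String address) {
        return address == null ? null : address.replaceAll("[\\r\\n]+", " ").trim();
    }

    public static void main(String[] args) {
        String raw = "12 Example St<br />\r\nSpringfield"; // hypothetical input
        System.out.println(clean(raw)); // 12 Example St<br /> Springfield
    }
}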
Any assistance will be appreciated!
Thanks,
Michael
↧
November 3, 2014, 7:57 pm
I have data coming from MongoDB and a SQL table, which I merge using the columns that are common between the table and the JSON.
Merged data set:

uid | id | Json
45  | 1  | { "key": "1/0/234", "t1": { "a": 10, "b": "test1" } }
46  | 2  | { "key": "1/0/234", "t1": { "a": 10, "b": "test1" } }
47  | 3  | { "key": "1/0/234", "t1": { "a": 10, "b": "test1" } }
After this I would like to insert the data into MongoDB, generating a new product key (which will have one field value from the JSON and two field values from the table).
How should I do it?
When I attempted doing it I got the following.

Actual output:
{
  "key" : "<string val>",
  "json" : "<JSON sub document>"
}

Expected output:
{
  "key" : "<string val>",
  "element" : "<JSON sub document>"
}
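For what it's worth, the expected shape shown above could be assembled like this with the plain MongoDB Java driver (a sketch only, outside of PDI; the connection string, database, collection and values are placeholders, and it assumes driver 3.7 or later):
Code:
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class InsertWithSubDocument {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                    client.getDatabase("mydb").getCollection("products");

            // Sub-document corresponding to the merged JSON column (placeholder values)
            Document element = new Document("a", 10).append("b", "test1");

            // New document with the generated key (placeholder) and the JSON as an "element" sub-document
            Document doc = new Document("key", "1/0/234")
                    .append("element", element);

            coll.insertOne(doc);
        }
    }
}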
↧
November 3, 2014, 9:17 pm
I have created the procedure:
call test(${word},${param});
When I select one word from the drop-down list my charts render. But when I select two words I get the "Error processing component" error.
Please help me with this.
Thanks in advance :cool:
↧
November 4, 2014, 2:26 am
Dear all,
1. I'm running Pentaho BI/CE 5.11 on Linux.
2. I have created a dashboard and saved it successfully.
3. When I try to preview the dashboard, a blank tab appears with the text: "Could not load dashboard: null".
The same happens when I try to open the file via "Browse Files".
4. If I open the file for editing, it opens and all the configurations appear as I saved them.
See the log excerpt from pentaho.log below.
The CDF/CDA/CDE plugins are up to date according to the Marketplace.
I also tried removing the .orient folder and restarting the server.
Any clue how to get this working?
Thanks
Bernhard
Code:
2014-11-04 11:05:26,545 ERROR [pt.webdetails.cdf.dd.api.RenderApi] Could not load dashboard: null
java.lang.NullPointerException
at java.util.regex.Matcher.quoteReplacement(Matcher.java:655)
at pt.webdetails.cdf.dd.model.inst.writer.cdfrunjs.dashboard.CdfRunJsDashboardWriteResult.render(CdfRunJsDashboardWriteResult.java:85)
at pt.webdetails.cdf.dd.api.RenderApi.render(RenderApi.java:148)
at pt.webdetails.cdf.dd.DashboardDesignerContentGenerator.createContent(DashboardDesignerContentGenerator.java:97)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.generateContent(GeneratorStreamingOutput.java:236)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.write(GeneratorStreamingOutput.java:163)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:54)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:33)
at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:306)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1479)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:108)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:879)
at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:617)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1774)
at java.lang.Thread.run(Thread.java:745)
↧
November 4, 2014, 5:40 am
I created the Stored procedure like:
DROP PROCEDURE IF EXISTS dish_data.Sp_Test;
CREATE PROCEDURE dish_data.`Sp_Test`(IN pHappinessCategory VARCHAR(50))
BEGIN
    SELECT
        CustomCategoryName AS 'Custom Category Name',
        COUNT(DISTINCT Ds.Call_ID) AS '# Calls',
        AVG(HappinessScore) AS 'Avg Happiness Score'
    FROM dish_custom_topics DT
    INNER JOIN dish_data_source ds ON dt.filename_id = ds.filename_id
    INNER JOIN Dish_Phrase dp ON Dp.filename_id = ds.filename_id
    WHERE (ds.HappinessCategory_id IN (pHappinessCategory) OR 0 IN (pHappinessCategory))
    GROUP BY CustomCategoryName
    ORDER BY 2
    LIMIT 25;
END;
When I pass a single value it works, but when I pass multiple values the charts display an error message.
Please help me with this. ;)
↧
November 4, 2014, 7:03 am
Good evening everybody,
I'm trying to migrate a dashboard (CDE) to an existing Tomcat. I found some guides on Google, but nothing that actually works...
Has anybody done this before? Is there a blog or similar that can help?
↧