February 25, 2016, 9:03 am
What is the best way to add the query results to an input stream row by row?
For example:
in the input stream we have the fields (name, city).
We also have a db table with tuples (id, city, country).
I would like to add (id, country), obtained with a query on the table, to the input stream fields.
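In PDI this is usually a Database Lookup (or Stream Lookup) step keyed on city; functionally it amounts to a keyed join. A minimal sketch of the equivalent logic, using the field names from the example above (the `lookupByCity` helper itself is invented for illustration):

```javascript
// Illustrative only: enrich each (name, city) row with (id, country)
// looked up by city, the way a Database Lookup step does per row.
function lookupByCity(rows, table) {
  var byCity = {};
  table.forEach(function (t) {
    byCity[t.city] = { id: t.id, country: t.country };
  });
  return rows.map(function (r) {
    // Rows with no match keep null fields, like a lookup step's defaults
    var hit = byCity[r.city] || { id: null, country: null };
    return { name: r.name, city: r.city, id: hit.id, country: hit.country };
  });
}
```

In Spoon the same thing is configured rather than coded: the step's key is city, and id and country are the returned fields.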
Thanks!
mauro
eg.jpg
↧
February 25, 2016, 11:51 am
Hello Everyone,
I am a newbie in Pentaho and I am trying to play with some data transformations.
I have files with names like Session-ABC-01-12-2015 and Session-XYZ-01-12-2015_RawData.csv. I am trying to extract just the date as a field after I load the CSV file. So once I load the file, I want the existing fields from the file plus an additional field that holds the date from the file name.
I tried Get File Names and Split Fields, but I am not able to get it right. I read that one way is to use JavaScript to get this done. Is there a way to get it done just using Spoon functionality?
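If the pure-Spoon route keeps fighting you, the JavaScript route is tiny in a Modified Java Script Value step. A sketch, assuming the date always appears as dd-mm-yyyy somewhere in the name (the `extractDate` helper is invented for illustration):

```javascript
// Pull a dd-mm-yyyy date out of a file name such as
// "Session-XYZ-01-12-2015_RawData.csv". Returns null when no date is found.
function extractDate(filename) {
  var m = filename.match(/(\d{2}-\d{2}-\d{4})/);
  return m ? m[1] : null;
}
```

The same pattern could equally go into a Regex Evaluation step in Spoon, which would keep the transformation script-free.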
Thanks for your time.
↧
February 25, 2016, 1:41 pm
Hello,
I would like to know what the current situation is with shared Mondrian cache.
I have read this article:
http://pedroalves-bi.blogspot.com.es...n-pentaho.html
But I think it's outdated, since some things have changed (for example datasources.xml doesn't exist).
Thank you.
↧
February 25, 2016, 3:41 pm
Has anyone had success using the HTTP POST step with updateWorklog from the Tempo Timesheets servlet? I am able to use getWorklog from the servlet successfully. An example of the HTTP POST settings and the corresponding XML input file would be very helpful.
↧
February 25, 2016, 10:13 pm
Hello,
When I use a group, the order of the query isn't taken into account, and the rows are sorted alphabetically instead. Is it possible to keep the query order?
Thank you.
↧
February 26, 2016, 12:04 am
I know Mondrian converts MDX queries into relational (SQL) queries and returns the result, but are there any details available about this process?
I use the sample cube HR. Here is the MDX:
WITH
SET [~ROWS] AS
TopCount({[Time].[Time].[Month].Members}, 3, [Measures].[Org Salary])
SELECT
NON EMPTY {[Measures].[Org Salary]} ON COLUMNS,
NON EMPTY [~ROWS] ON ROWS
FROM [HR]
And this is the SQL generated for that MDX; I found it in the log:
select
"time_by_day"."the_year" as "c0",
"time_by_day"."quarter" as "c1",
"time_by_day"."the_month" as "c2",
"time_by_day"."month_of_year" as "c3",
sum("salary"."salary_paid") as "c4"
from
"salary" as "salary",
"time_by_day" as "time_by_day"
where
"time_by_day"."the_year" = 1997
and
"salary"."pay_date" = "time_by_day"."the_date"
group by
"time_by_day"."the_year",
"time_by_day"."quarter",
"time_by_day"."the_month",
"time_by_day"."month_of_year"
order by
CASE WHEN sum("salary"."salary_paid") IS NULL THEN 1 ELSE 0 END, sum("salary"."salary_paid") DESC,
CASE WHEN "time_by_day"."the_year" IS NULL THEN 1 ELSE 0 END, "time_by_day"."the_year" ASC,
CASE WHEN "time_by_day"."quarter" IS NULL THEN 1 ELSE 0 END, "time_by_day"."quarter" ASC,
CASE WHEN "time_by_day"."the_month" IS NULL THEN 1 ELSE 0 END, "time_by_day"."the_month" ASC,
CASE WHEN "time_by_day"."month_of_year" IS NULL THEN 1 ELSE 0 END, "time_by_day"."month_of_year" ASC
I changed the TopCount from 3 to 10 and got the same SQL; there is nothing like LIMIT in it, yet only 3 or 10 items are returned. It works, but how?
So I am wondering what happens during a query. I searched and didn't find any useful information. Can anybody help?
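For what it's worth, the behaviour described above is consistent with Mondrian fetching the fully ordered result and applying TopCount in memory rather than through a SQL LIMIT. This is a hedged guess at the mechanism, not a description of Mondrian's actual evaluator; conceptually it would look like:

```javascript
// Conceptual sketch only: given rows already ordered or orderable by a
// measure, keep the top n. An in-memory TopCount reduces to this.
function topCount(rows, n, measure) {
  return rows
    .slice() // copy so the input is not mutated
    .sort(function (a, b) { return b[measure] - a[measure]; }) // largest first
    .slice(0, n); // keep only the first n tuples
}
```

That would explain why changing 3 to 10 leaves the SQL untouched: the cut happens after the rows come back from the database.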
Thank you.
Longxing
↧
February 26, 2016, 5:04 am
Hello,
Does anyone know where (in which file or files) passwords in Pentaho are stored as clear text, any passwords?
As far as I know, the JNDI connection passwords are stored as clear text in the context.xml file (as in the attached file). Are there any other files in which passwords are stored as clear text?
Is there any way to encrypt these passwords?
If not, is there any other solution? I want to avoid having any kind of password in clear text.
How can I change these users' passwords (hibuser, pentaho_user, Jcr_user)?
JNDI pass Postgre.jpg
Thanks in advance
↧
February 26, 2016, 5:14 am
Hello All,
I would like to know if I can give weights to different classifiers when combining them with logistic regression in Weka. Is this possible?
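For context: combining classifiers with logistic regression (stacking) effectively learns one weight per classifier, and the simplest fixed-weight version of that idea is a weighted average of the classifiers' class-probability estimates. The sketch below shows only that arithmetic, not the Weka API (the `weightedCombine` helper is invented):

```javascript
// Toy weighted ensemble: combine per-class probability estimates from
// several classifiers using fixed weights.
function weightedCombine(probs, weights) {
  // probs: one probability array per classifier, all the same length
  var nClasses = probs[0].length;
  var out = [];
  for (var c = 0; c < nClasses; c++) {
    var sum = 0, totalWeight = 0;
    for (var i = 0; i < probs.length; i++) {
      sum += weights[i] * probs[i][c];
      totalWeight += weights[i];
    }
    out.push(sum / totalWeight); // normalized weighted average
  }
  return out;
}
```

Whether Weka exposes fixed per-classifier weights directly, or only learns them via a stacking meta-learner, is exactly the open question here.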
Thanks in advance.
↧
February 26, 2016, 7:50 am
Here is what I am interested in doing:
1. I have a set of columns from one input source.
2. I have another set of columns from another input source.
I want to take the values from 1 and map them to the values in 2.
I am trying to do this without manually typing everything.
Is this possible? Can you give me advice on where to start?
↧
February 26, 2016, 9:50 am
We've been making some changes to the Ctools projects, gradually migrating them to Maven. It's a WIP, but you can see that, e.g., CDF and CDA already have that structure.
Unlike Ant, where things just work when you type the command line, Maven requires some extra settings. Our dev team recently documented everything that's needed, so it's all now in the README files.
Here's a screenshot from the CDF README.md:
More...
↧
February 26, 2016, 10:11 am
Hi all,
I recently downloaded Pentaho PDI CE 6.0.1, and seem to be unable to run Spoon at all. The error I get is:
Could not find the main class: org.pentaho.commons.launcher.Launcher. Program will exit.
I am running Windows 7, though I also intend to run Pentaho Data Integration on Linux.
I am aware that this issue is similar to what is found here:
http://forums.pentaho.com/showthread...the-main-class
However, this is a bit strange as the old version I have (apparently 4.2.0?) seems to run spoon.bat on Windows without that error.
I do not have a JAVA_HOME set in the environment variables, though I did not need one before. I have a JRE in c:\Program Files\Java\jre6, and I have Oracle installed as well.
I extracted Pentaho to C:\dev\pdi-ce-6.0.1.0-386; it has a directory within it called "Data Integration".
If anyone is able to help, it is appreciated.
↧
February 26, 2016, 11:33 am
I have one transformation using ETL Metadata Injection which executes a separate Access Input transformation. I've confirmed that the Access Input step reads the correct number of rows for each table. However, since Get Fields on the Access Input step is not run for each table, no field names or data are output to the stream. The Access database contains many tables with different layouts, so my question is: what do I need to do to either the ETL Metadata Injection or the Access Input step to get the data? The transformations are attached; I could not attach the test database, but any DB should work.
TIA
Windows 7
PDI 5.3
↧
February 26, 2016, 9:55 pm
Hi,
Can anyone guide me on how to check whether the Pentaho BI service is running on Linux from the command line?
Thanks
↧
February 27, 2016, 4:39 am
Hi All,
I am new to Pentaho.
My data source is a CSV file and I have negligible knowledge of MDX queries.
Can we use SQL queries to access a CSV data source in Pentaho CDE?
If yes, how?
Thank you
Preeti Agrawal
↧
February 27, 2016, 3:27 pm
Is there some relatively simple approach to apply database transactional logic into a job entry in Community Edition?
I've thought of transformation executors within a transformation with blocking steps, but for quite complex processing consisting of many transformations that need to share a DB connection, that seems a bit odd.
↧
February 27, 2016, 6:22 pm
Hi all,
I'm new here and just beginning to use Pentaho :), so I need your help please...
My problem is fixing the colors for each category in a TreeMap chart.
For example: I have 2 categories of car, BMW and VW, and for each category I have many models;
BMW: Serie1, Serie2....
VW: Golf, Jetta, Passat....
So I would like to fix the colors for these two categories, BMW and VW (all BMW in blue and all VW in red).
I have tried many things, but nothing works; I can't fix the colors!
I have also tried this in Pre Execution, Post Execution, Post Fetch, Pre Change, and Post Change:
function f() {
    // Series value => fixed color
    this.chartDefinition.colorMap = {
        "BMW": "blue",
        "VW": "red"
    };
}
But nothing works.
Thank you for your help!!
↧
February 28, 2016, 12:15 am
Hi,
I am using Saiku Analytics for ad hoc analysis. When I add a virtual cube to my schema from Workbench, it publishes successfully, but in Saiku Analytics my whole schema disappears. Can someone guide me on how to see virtual cubes in Saiku Analytics?
I am using Pentaho 6.0.
I read somewhere that virtual cubes are no longer available in Mondrian 4.0.
Can someone guide me on this?
Thanks
↧
February 28, 2016, 12:41 pm
Hello,
My project is about creating user interfaces with Pentaho. I would like to know which Pentaho products would be useful for my project, and what their limits are.
Thank you :)
↧
February 28, 2016, 2:04 pm
I have Pentaho 5.4 and I am trying to use my own custom Authentication Manager. I have replaced the authentication manager in applicationContext-spring-security.xml with my own, like this:
Code:
<!-- ======================== AUTHENTICATION ======================= -->
<jee:jndi-lookup id="oracleDS" jndi-name="jdbc/Sso" resource-ref="true" />

<bean id="myAuthenticator" class="com.my.pentaho.Authentication.MyAuthenticator">
  <property name="dataSource">
    <ref local="oracleDS" />
  </property>
</bean>

<bean id="authenticationManager" class="org.springframework.security.providers.ProviderManager">
  <property name="providers">
    <list>
      <pen:bean class="org.springframework.security.providers.AuthenticationProvider"/>
      <ref local="myAuthenticator" />
    </list>
  </property>
</bean>
And I can log in fine; however, the main screen is not showing up, and I am getting this error in the log:
Code:
ERROR [BackingRepositoryLifecycleManagerAuthenticationSuccessListener] Repository access exception; nested exception is javax.jcr.security.AccessControlException: Principal Michael-/pentaho/tenant0 does not exist.
org.springframework.extensions.jcr.JcrSystemException: Repository access exception; nested exception is javax.jcr.security.AccessControlException: Principal Michael-/pentaho/tenant0 does not exist.
There also seem to be some Jackrabbit errors as well, which I don't understand, as I am not using Jackrabbit, JDBC, or LDAP, but my own provider.
Any ideas? I also can't seem to find a newer (5.x+) tutorial on how to use a custom login: not a custom JDBC or custom LDAP one, but entirely my own. I am well versed in Spring Security, but it seems Pentaho is doing something else.
↧