Channel: Pentaho Community Forums

PDI-5252 : Text File Output does not create header when append option is checked.

Can somebody tell me whether the issue http://jira.pentaho.com/browse/PDI-5252 is fixed, and in which version?
I tested it in the 5.1.0 and the 5.2.0 versions.
But with append and header checked, I don't get a header when the output file is created.


Or does somebody know a workaround?
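One workaround I'm considering is to create the header line myself before the first append run, for example with a small pre-step or script along these lines, but I'd prefer a built-in solution (the file path and header text below are just placeholders):

Code:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class HeaderWorkaround {
    public static void main(String[] args) throws IOException {
        // Placeholder path and header; in a real job these would match the
        // Text File Output step's filename and field layout.
        Path out = Paths.get("output.txt");
        String header = "field1;field2;field3" + System.lineSeparator();

        // Only write the header if the file does not exist yet, so the
        // Text File Output step (with append checked) just adds rows after it.
        if (Files.notExists(out)) {
            Files.write(out, header.getBytes(StandardCharsets.UTF_8));
        }
    }
}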

Time Series Forecasting - Using overlay data in Java code

Hi,

I've managed to get the Weka Explorer to produce a forecast for my set of data, using two sets of overlay data (all in the same csv file) by evaluating on heldout data.

However, when I try to implement this in Java using the example on the Time Series web page, I get the error: [Weka Forecaster] was trained with overlay data but none has been supplied for making a forecast!

I'm not sure how to set the holdout data set in my code.

Also, how would I go about comparing these forecasted values to the new recorded values and adjusting the forecast accordingly? Or would this not be possible?
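For reference, here is roughly what my Java code looks like, trimmed down. The field and file names are from my own data, and I'm assuming the forecast() overload that takes an extra Instances argument (together with setOverlayFields()) is the intended way to hand over the overlay/holdout values:

Code:

import java.io.FileReader;
import java.util.List;

import weka.classifiers.evaluation.NumericPrediction;
import weka.classifiers.functions.GaussianProcesses;
import weka.classifiers.timeseries.WekaForecaster;
import weka.core.Instances;

public class OverlayForecastSketch {
    public static void main(String[] args) throws Exception {
        // history.arff holds the training data; future_overlay.arff holds only
        // the overlay columns for the periods to forecast (placeholder names).
        Instances history = new Instances(new FileReader("history.arff"));
        Instances futureOverlay = new Instances(new FileReader("future_overlay.arff"));

        WekaForecaster forecaster = new WekaForecaster();
        forecaster.setFieldsToForecast("sales");           // target field (placeholder)
        forecaster.setOverlayFields("temperature,promo");  // overlay fields (placeholder)
        forecaster.setBaseForecaster(new GaussianProcesses());
        forecaster.getTSLagMaker().setTimeStampField("date");

        forecaster.buildForecaster(history, System.out);
        forecaster.primeForecaster(history);

        // Overload that supplies overlay values for the forecast horizon.
        List<List<NumericPrediction>> preds =
            forecaster.forecast(futureOverlay.numInstances(), futureOverlay, System.out);
        System.out.println(preds);
    }
}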

Thanks!

AddIn dataBar: how to get a gradient color

Hi,
for my table component this is the piece of code in pre-execution:
Code:

    // bars
    this.setAddInOptions("colType", "dataBar", function(column3) {
      return {
        includeValue: true,
        widthRatio: 0.8,
        startColor: "orange",
        endColor: "green",
        valueFormat: function f(v) { return sprintf('%.0f', v); }
      };
    });

How can I obtain a gradient color bar like this?

gradientbar.png

Thanks!

Pentaho 5.3 is available! And so is CE!

See? I wasn't lying when I said in my last post that we wouldn't stop releasing our CE edition! :)

Without further delay, you can access the releases from the usual place. See the download section of community.pentaho.com


New ctools sample in 5.3


As I mentioned in the announcement for the 5.3 RC, once again there was a huge amount of work and bug fixing. Here are some links to some of the projects' changelogs (there's much more though, as we have the code split into different modules):




But overall, here's a snippet of the changes. This is the overall list, so there's EE stuff here as well:


  • SDR phase II (EE)
    • Using an existing model for publish
    • Using user-defined annotations to assist auto-model generation
    • Validated to work with Amazon Redshift and Impala
    • Datasource Security Phase I

  • Support for CDH 5.2 and MapR 4.0.1 (CE)
  • PDI Related Items
    • Named Clusters (CE)
    • Carte Clustering (CE)
    • Vizor as a package (EE)
    • PDI SDK as official release package (CE)

  • BA Server / Plugin Improvements
    • Data Access - 0 bad builds during ALL 5.3.0 RC builds (CE)
    • Analyzer JS API and Documentation (EE)
    • PIR Improvements (EE)
      • Design & Runtime row-limits and ability to schedule when hit
      • Toolbar button when embedded
      • Performance Improvements

  • Documentation (All)
    • PDI SDK documentation
    • Analyzer API documentation
    • Carte Clustering documentation
    • MT Documentation

  • Over 300 Closed and Validated jiras! (Majority common code)


Have fun!!


Problem with Kettle and a degree thesis

Hi everyone, I'm a student at the University of Palermo who has unfortunately been assigned a monster of a thesis involving Pentaho. I have to build a data warehouse of biological networks from databases downloaded from various portals.

And I'm running into some problems...


Currently my problem is that a column of a table contains several elements separated by "|". I only need one of these elements, so which Kettle step can I use to extract it? Or at least to strip everything up to a given character?
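If a scripting step turns out to be the way to go, the extraction itself would just be a split on the pipe character; a minimal plain-Java sketch of what I mean (the cell value and the index of the element I need are made up):

Code:

public class PipeSplitExample {
    public static void main(String[] args) {
        // Example cell value with several elements separated by "|" (made-up data).
        String cell = "P12345|Q9Y6K9|interaction|9606";

        // Keep only the second element; the pipe has to be escaped because
        // String.split() takes a regular expression.
        String[] parts = cell.split("\\|");
        String wanted = parts.length > 1 ? parts[1] : cell;

        System.out.println(wanted); // prints Q9Y6K9
    }
}

I suspect the Split Fields step or a JavaScript step can do the same thing natively, but I'd like confirmation.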

Thanks, everyone...

Mondrian Cube's hierarchy is showing duplicate values

First of all, I know there's a similar post just below this one, but I think both problems are different.

I have a location dimension table with: PersonID | Region ID | RegionName | ProvinceName | ComunaName | CityName (Forget about the names, they aren't important). As you can probably guess, this table shows the location of every person in my database.

In my Mondrian cube, PersonID is the primary key of the hierarchy and it connects with the fact table's PersonID column. So far so good, right? Now I want to build the hierarchy "RegionName -> ProvinceName -> ComunaName -> CityName", but every time I run an analysis in Saiku, it shows duplicates in every level but the top one (Region).

For example, it should show 296 cities, but instead it shows around 700. It's really annoying and I'm running out of ideas here.

I'm attaching 4 images showing what I've done.
Hope somebody can help me.

Image of the table itself and its content (CodAux is the PersonID, the rest is only logical).
Duda_Tabla_DimLugar.png

My hierarchy properties in Schema Workbench.
Duda_jerarquía.jpg

My City level properties, as an example.
Duda_ciudad.jpg

The results in saiku, showing some of the duplicates and the 700 total results.
Duda_saiku.jpg

Thanks in advance.

I really need help with this (variables)

I have been all over the place on this, and I can't put it all together based on what I am seeing.

Very simply, I need to create a transformation that will do a number of things with a variable I want to pass as a parameter.

I want to use this parameter in an Execute SQL step, and it would be used more than once there.

I can't seem to find the proper place to set it, or how to make the query use this variable when I set it while launching the transformation.

It's really just passing a string for an account number like '0000000001', so I can run DELETE FROM ACCOUNTS WHERE ACCT_NO = ${AccountNo} against an Oracle database.

Is it possible for someone to throw a simple kettle together as an example? I know there won't be a connection to a DB; I can configure that. I just can't understand how the variable is set and used in the transformation.
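To show what I mean, here is a rough sketch of how I imagine the parameter being set when calling the transformation from Java with the Kettle API (the file name is a placeholder, and I'm assuming the .ktr declares AccountNo as a named parameter and that the SQL step has variable substitution enabled):

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunWithParameter {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // Load the transformation; "delete_account.ktr" is a placeholder file name.
        TransMeta transMeta = new TransMeta("delete_account.ktr");
        Trans trans = new Trans(transMeta);

        // Set the named parameter that the SQL references as ${AccountNo}.
        trans.setParameterValue("AccountNo", "0000000001");
        trans.activateParameters();

        trans.execute(null);
        trans.waitUntilFinished();
    }
}

In the SQL itself I would then expect to write ACCT_NO = '${AccountNo}' with quotes, since it's a string, but please correct me if that's wrong.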

Thanks ahead of time for your help.

Java.lang.NullPointerException in a transformation

Hello All!

I am running a transformation, reading a CSV file, changing a value in one of its fields, and writing it to an Excel file. The CSV file has upwards of 30 million records. When I preview the final step, I can see 100, 1,000 or even 1 million records, but when I run the ktr using the run button, it gives the following errors:

2015/02/18 15:07:28 - Spoon - Transformation opened.
2015/02/18 15:07:28 - Spoon - Launching transformation [Cloudera1]...
2015/02/18 15:07:28 - Spoon - Started the transformation execution.
2015/02/18 15:07:28 - Cloudera1 - Dispatching started for transformation [Cloudera1]
2015/02/18 15:07:28 - CSV file input.0 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unexpected error
2015/02/18 15:07:28 - CSV file input.0 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2015/02/18 15:07:28 - CSV file input.0 - java.lang.NullPointerException
2015/02/18 15:07:28 - CSV file input.0 - at java.lang.Thread.run (Thread.java:744)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.step.RunThread.run (RunThread.java:62)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.steps.csvinput.CsvInput.processRow (CsvInput.java:80)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.steps.csvinput.CsvInputMeta.getFields (CsvInputMeta.java:361)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.core.row.RowMeta.addValueMeta (RowMeta.java:161)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.core.row.RowMeta.renameValueMetaIfInRow (RowMeta.java:505)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.core.row.RowMeta.searchValueMeta (RowMeta.java:465)
2015/02/18 15:07:28 - CSV file input.0 -
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.steps.csvinput.CsvInputMeta.getFields(CsvInputMeta.java:381)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.steps.csvinput.CsvInput.processRow(CsvInput.java:80)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/02/18 15:07:28 - CSV file input.0 - at java.lang.Thread.run(Thread.java:744)
2015/02/18 15:07:28 - CSV file input.0 - Caused by: java.lang.NullPointerException
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.core.row.RowMeta.searchValueMeta(RowMeta.java:465)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.core.row.RowMeta.renameValueMetaIfInRow(RowMeta.java:505)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.core.row.RowMeta.addValueMeta(RowMeta.java:161)
2015/02/18 15:07:28 - CSV file input.0 - at org.pentaho.di.trans.steps.csvinput.CsvInputMeta.getFields(CsvInputMeta.java:361)
2015/02/18 15:07:28 - CSV file input.0 - ... 3 more
2015/02/18 15:07:28 - CSV file input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2015/02/18 15:07:28 - Cloudera1 - Transformation detected one or more steps with errors.
2015/02/18 15:07:28 - Cloudera1 - Transformation is killing the other steps!
2015/02/18 15:07:28 - Cloudera1 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Errors detected!
2015/02/18 15:07:28 - Spoon - The transformation has finished!!
2015/02/18 15:07:28 - Cloudera1 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Errors detected!
2015/02/18 15:07:28 - Cloudera1 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Errors detected!


I have been searching around for a while, but I am not able to figure this out.

Problem connecting to MS SQL Server: error on user name and login

Hi

I am a very new user of Pentaho Kettle. I am trying to connect to MS SQL Server through Pentaho Kettle, but it's giving me an error on the login user name and credentials. I am using these credentials in many other places; the problem only occurs in Pentaho Kettle.

The message is shown below; any help would be much appreciated.

Error connecting to database [SQL SRVER CONNECTION] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database


Error connecting to database: (using class com.microsoft.sqlserver.jdbc.SQLServerDriver)
Login failed for user 'pentaho_test_user1'. ClientConnectionId:e15a8f36-944a-4f74-a943-26545c5a1639




org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database


Error connecting to database: (using class com.microsoft.sqlserver.jdbc.SQLServerDriver)
Login failed for user 'pentaho_test_user1'. ClientConnectionId:e15a8f36-944a-4f74-a943-26545c5a1639




at org.pentaho.di.core.database.Database.normalConnect(Database.java:427)
at org.pentaho.di.core.database.Database.connect(Database.java:361)
at org.pentaho.di.core.database.Database.connect(Database.java:314)
at org.pentaho.di.core.database.Database.connect(Database.java:302)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2685)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:546)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:138)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:389)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:318)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:59)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:464)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:451)
at org.pentaho.di.ui.spoon.Spoon.newConnection(Spoon.java:8736)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1310)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7931)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9202)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:648)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class com.microsoft.sqlserver.jdbc.SQLServerDriver)
Login failed for user 'pentaho_test_user1'. ClientConnectionId:e15a8f36-944a-4f74-a943-26545c5a1639


at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:572)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:410)
... 55 more
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'pentaho_test_user1'. ClientConnectionId:e15a8f36-944a-4f74-a943-26545c5a1639
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216)
at com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:254)
at com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:84)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:2908)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:2234)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.access$000(SQLServerConnection.java:41)
at com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:2220)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:1326)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:991)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:827)
at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:1012)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:547)
... 56 more

Host name: cannot share
Port: 1433
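To rule Kettle out, a bare JDBC connection test with the same driver and credentials might help, roughly like this (host, database and password are placeholders):

Code:

import java.sql.Connection;
import java.sql.DriverManager;

public class SqlServerLoginTest {
    public static void main(String[] args) throws Exception {
        // Same driver class that Kettle reports in the error above.
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");

        // Placeholder host/database; the user name matches the one in the error.
        String url = "jdbc:sqlserver://myhost:1433;databaseName=mydb";
        try (Connection con = DriverManager.getConnection(url, "pentaho_test_user1", "secret")) {
            System.out.println("Connected: " + !con.isClosed());
        }
    }
}

If this fails with the same "Login failed" message, I assume the problem is on the SQL Server side (for example, SQL authentication not being enabled for that login) rather than in Kettle.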

Cannot get scheduled reports for yesterday to work

I am a new user of Pentaho, evaluating it as a possible BI solution for us. I have found it to be both amazing and really frustrating. Right now a show-stopper for us is the scheduling of reports. I need to run a report today for yesterday's data, which seems like a terribly basic need, but it keeps running with the date from the last time we ran it interactively for yesterday. I have trawled the forums and seen other people having the same issue, but no answers that worked.

Really appreciate any advice, help, docs, etc.

Cheers
Bruce

How can I set the length of a new field in the Row Normaliser?

I am trying to normalise data that is 100 chars long, but I don't see anywhere to set the field length of my new field. It defaults to 25 chars and truncates my fields.

PDI for Data migration from Greenplum to Hadoop

Hi,
I'm a newbie and I stumbled upon PDI while looking for ways to migrate data from GP (Greenplum) to Hadoop. I was initially looking at Sqoop to do this. I think PDI provides a higher abstraction and would like to try it out. Could you point me to the right documentation? I have installed the trial version of PDI EE.

Thanks
HK

Red Arrow Comments in Report (Exported in Excel Format)

Is it possible to generate (and attach) comments to individual data cells when a PRD report is exported to the Microsoft Excel - i.e. xls or xlsx - format? I have tried multiple different approaches - none of which have actually worked. For example, the tooltip attribute in the swing-event category will generate a generic tool-tip when you hover over a cell in the Preview generated for the report by PRD; however, this does not translate over to the excel-formatted version of the report.

Please Note: I am explicitly referring to the "red-arrow comments" (created manually by right-clicking an arbitrary cell, and choosing the "Insert Comment" option in the context menu). These comments behave like resizable tooltips that appear when the user hovers over the top-right corner of the cell.

Pentaho 5.3 CE - Remove Security by Allowing Anonymous Access

Hi,

I'm trying to remove security by allowing anonymous access using this link:

https://help.pentaho.com/Documentation/5.3/0P0/150/040

But it keeps popping up a box asking for user and password.

I could not find these lines in the pentaho.xml file:

<pentaho-system>
  <!-- omitted -->
  <anonymous-authentication>
    <anonymous-user>anonymousUser</anonymous-user>
    <anonymous-role>Anonymous</anonymous-role>
  </anonymous-authentication>
  <!-- omitted -->
</pentaho-system>

What can I do?

Thanks !

Emannuel Roque

Carte: making logs persistent?

Hello,

I'm trying to get my head around a proper use of the Carte slave server feature.
For that, I need to be able to view the resulting logs of a remotely executed job at any time.
But as far as I can see, the logs on the Carte server are only held until a reboot of the server, am I right?
At least the Carte web interface (http://myserver:myport/kettle/status) is reset with every reboot.

On the master server, I can't see any logs about jobs that are executed on the slaves...

Am I missing an option to make the remote carte logs persistent?

Thanks in advance
Doogie

Carte: missing date/time in status-xml

Hello,

I'm using Carte to remotely execute jobs, and the Carte web status page to check the status of these jobs.
For custom monitoring/alerting of these jobs, I wanted to automatically read and process the XML information of the status page (http://myserver:myport/kettle/status/?xml=y).

But to my disappointment, a vital piece of information is missing: the date and time of the executed job!
In the HTML status page it's present, but not in the XML view.


Any idea where I can find this information for automatic processing of the status page?
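One fallback I'm considering is to record the timestamp myself each time I poll the XML, something like the sketch below (server address and credentials are placeholders; I'm just dumping every leaf element rather than assuming specific tag names):

Code:

import java.net.HttpURLConnection;
import java.net.URL;
import java.time.LocalDateTime;
import java.util.Base64;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class CarteStatusPoller {
    public static void main(String[] args) throws Exception {
        // Placeholder host and port; ?xml=y returns the machine-readable view.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://myserver:8081/kettle/status/?xml=y").openConnection();
        // Carte normally asks for basic auth; the credentials here are placeholders.
        conn.setRequestProperty("Authorization", "Basic "
                + Base64.getEncoder().encodeToString("cluster:cluster".getBytes()));

        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(conn.getInputStream());

        // The XML itself carries no execution date, so record the poll time here
        // and associate it with whatever the server reports right now.
        LocalDateTime polledAt = LocalDateTime.now();

        NodeList nodes = doc.getDocumentElement().getElementsByTagName("*");
        for (int i = 0; i < nodes.getLength(); i++) {
            Node n = nodes.item(i);
            boolean leafWithText = n.getChildNodes().getLength() == 1
                    && n.getFirstChild().getNodeType() == Node.TEXT_NODE;
            if (leafWithText) {
                System.out.println(polledAt + "  " + ((Element) n).getTagName()
                        + " = " + n.getTextContent().trim());
            }
        }
    }
}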


thanks in advance
Doogie

Postgres BIT data type not allowing data to be loaded through Kettle

Hi Guys,

I am trying to load data into a Postgres bit(1) column. It's not letting me insert the data. I am using a "Select values" step to cast the data type to integer/number/boolean and others.
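For what it's worth, here is how I would expect the insert to look in bare JDBC, where an explicit cast seems to be needed (connection details, table and column names are placeholders; the column is assumed to be bit(1)):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BitInsertSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        String url = "jdbc:postgresql://localhost:5432/mydb";
        try (Connection con = DriverManager.getConnection(url, "user", "secret");
             PreparedStatement ps = con.prepareStatement(
                     // Explicit cast, since PostgreSQL does not coerce an integer
                     // parameter into bit(1) on its own.
                     "INSERT INTO my_table (flag) VALUES (CAST(? AS bit(1)))")) {
            ps.setInt(1, 1); // 1 or 0
            ps.executeUpdate();
        }
    }
}

So an Execute SQL step with the same explicit cast might be a workaround in Kettle, but I'd prefer the Table output step to handle it.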

Kindly help on this issue.

Reach me at sureshreddy.d@gmail.com

Configuring Mondrian on JBOSS server.

Hi All,

I need your help to configure Mondrian on a JBoss server. What are the prerequisites for it?

Do any configurations need to be changed?


Can anyone point me to a sample working setup configured on JBoss, or the steps to configure it?


Any help here is highly appreciated.


Thanks in advance.



Best Regards,
Bhargava.

Nothing happens while trying to import .ktr file

After looking in the FAQ section and using the search function without finding anything that helped, I'm posting this new thread.
I'm using Kettle/Spoon 4.1.0, and yes, I'm aware that it is an older version, but I need to use it since it is bundled with the BMC Atrium CMDB version I need to integrate with.

I've tried this using Spoon on the server with a 64-bit Windows OS and on a client PC with a 32-bit Windows OS, both with the same result.

I'm trying to import a .ktr file in Spoon via File > Import from an XML file. When I've selected the file to import and click Open, nothing seems to happen. No message, no transformation/job pops up, nothing.
Any idea why the import is not working? Any input is very welcome here.

Thanks in advance
Karl =)

Pentaho MDX Reports (and dashboards, whatever) with the All member

Got this question recently - and figured to do a blog post on it.

How do you include the All member as part of a report's prompt and get results out of it?
This is basically a follow-up to what Diethard Steiner blogged about some years back, but still very useful. And it's all done with some MDX query magic.

The report

So here's what we'll have in the end:



And this is built in a way that if I select a different level in the hierarchy, for instance Classic Cars, the output will immediately reflect that:


The editor

This sounds like a very complex report, with a few subreports for the inner queries, right?

Well, no. Here's the report:


Sorry, but I can't imagine something simpler! Now, how do we get from here to the report I showed before?

Step 1 - The parameters

It all starts with how we pass the parameters. The trick is to make sure we get the fully qualified member name (that is, [Products].[Classic Cars].[Gearbox Collectibles] instead of just Gearbox Collectibles).

Here's the query I used:

with
member [Measures].id as Product.currentMember.uniqueName
member [Measures].[name] as Product.currentMember.Name
select {Measures.id, Measures.[name]} on Columns,
Descendants([Product].[All Products],2,SELF_AND_BEFORE) on Rows
from [SteelWheelsSales]

This will populate the prompt with all the members. The trick here is to use the Descendants function in conjunction with the uniqueName property.

Step 2 - The query

Now we need to prepare the query to receive those arguments:

With
member [Measures].[Name] as [Product].currentMember.Name
member [Measures].[Level] as [Product].currentMember.Level.ordinal

select {Measures.[Name],Measures.[Level],Measures.Sales, Measures.Quantity} on Columns,
Descendants(Parameter("territory",[Product],[Product].[All Products]),
[Product].[Vendor],SELF_AND_BEFORE
)
on Rows
from [SteelWheelsSales]

I could shape this query any way I'd like - in here I want my parameter to select not a single aggregated row but every child member as well. The special MDX Parameter function allows me to have a query that not only works at design time but will also correctly receive the parameter. I could have used ${territory} as well, but honestly forgot how to use it without losing the preview / design time ability :)

You'll see some (not so) minor details here:

  • I'm explicitly asking for the name: that's because PRD would give me different cells for the different levels, and I just want the name. This way I can drag just that field to my report and it will always show the name. In CDE / CDA this wouldn't be needed, as we have a different way to parse the MDX result set by default there
  • I'm using a slightly different syntax of the Descendants, where instead of specifying how many levels down we need to go, we specify the destination level
  • I'm also getting the level number so I can use it to conditionally change the layout - add the indent and change the font size

Step 3 - the layout

As shown before, the layout is very simple; just the 3 fields in there:

Step 4 - the conditional formatting

Two extra things that I did. On the Measures.[Name] I conditionally formatted the font size and the left padding, respectively:

  • =10 + 3 - ["[Measures].[Level]"] (font size)
  • =["[Measures].[Level]"] * 10 (left padding)

So, for example, the All member (level 0) is rendered at 13pt with no padding, while a level-2 member gets 11pt and 20 units of padding, which produces the indented, shrinking-font effect.

All this joined together, we get the full report!


Have fun!


