Channel: Pentaho Community Forums

Interactive Reports

Hello,

In Pentaho 6.0, User Console- Interactive Reports:



- Is it possible to create a filter (or a prompt) on one field while grouping by another field? In other words, when a report is grouped by a field, why is it not possible to add a filter (or a prompt) on a different field?

Note: the filter (or prompt) can only be applied to the Group By field itself, or to a field that is also included as a column.


I'm getting three kinds of error messages:

1- If there is already a group by and I want to add a filter:


Unable to create Filter: you will need to remove the Group(s) or add the filter field as a column in the Report

2- If there is already a group by and I want to add a Prompt:


Unable to create prompt: you will need to remove the Group(s) or add the Prompt field as a column in the Report

3- If there is already a filter or a Prompt and I want to add a group by field:


Unable to create Group: you will need to remove the filter or add the filter field as a column in the Report



There is a sample from Pentaho (the Vendor Sales report) that works, but if a group-by, filter, or prompt field is added (or an existing one is deleted and added again), one of the three error messages appears again.




Can anyone help?



Thanks in advance

Why does my BI Server 4.8 not have a dashboard creation option?

Hi

I'm very new to reporting as well as to Pentaho. I have to build dashboards, and in all the tutorials I've seen there is an option for a new dashboard, but there is nothing like that in Community Edition 4.8. Can anyone tell me whether I'm missing plugins, whether it's a version issue, or something else?

Thanks in advance

Complex calculations

I just got acquainted with Mondrian and scanned the documentation. Before I really start installing/learning how to use Mondrian, I want to know if (and how) complex calculations can be performed.

Suppose a transport company: shipments are loaded in week 1 and unloaded in week 2. The user wants to see only a single fact, KgPerShipment, and has one dimension, Week (plus a grand total).

The calculation of a cell would be something like: Average of {select ShipmentNo, Max(Kg) from Shipments where Kg > 0 group by ShipmentNo}.
Of course, only shipments belonging to that cell should be included.

There is something like user-defined functions, but they look like alternatives to SUM, AVG, MIN, etcetera.
Can the formula in CalculatedMember handle such a calculation or is there another way to handle this?
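The cell value described above is an aggregate of an aggregate (the average over shipments of each shipment's maximum Kg). As a sanity check of the intended semantics, here is a plain-Python sketch, independent of Mondrian; the input is assumed to be the (ShipmentNo, Kg) rows already restricted to the cell's Week slice:

```python
# Average over shipments of each shipment's Max(Kg), ignoring Kg <= 0.
def kg_per_shipment(rows):
    max_kg = {}
    for shipment_no, kg in rows:
        if kg > 0:
            max_kg[shipment_no] = max(kg, max_kg.get(shipment_no, 0))
    return sum(max_kg.values()) / len(max_kg) if max_kg else 0.0

rows = [("S1", 100), ("S1", 80), ("S2", 40), ("S2", 0)]
avg = kg_per_shipment(rows)   # max per shipment: S1=100, S2=40 -> avg 70.0
```

In Mondrian, two-level aggregations like this are often easier to handle by pre-aggregating in the database or ETL layer (one row per shipment carrying its Max(Kg)), after which a plain average measure suffices; whether a CalculatedMember alone can express it depends on whether ShipmentNo is modeled as a dimension level.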

Hadoop File Input with Snappy compression

I have some compressed files in Hadoop and want to read them from Kettle/PDI 6.0.0.0, but I'm not able to do so.


My "Hadoop File Input" step shows "Hadoop-snappy" and "Snappy" as compression options:

- Using the Hadoop-snappy option it shows me:
Code:

java.lang.Exception: Hadoop-snappy does not seem to be available

- Using the Snappy option it shows me:
Code:

java.io.IOException: FAILED_TO_UNCOMPRESS(5)



Please note that I've already set up the cdh54 plugin to match the cluster configuration.


So, what could be the problem?

Paired Bar-Line Measures (Dual y axis bar chart with trend line)

Hi guys,
I am trying to achieve this chart - http://www.webdetails.pt/ctools/ccc/...-line-measures
I tried to build this chart using the Pentaho CDE CCC bar chart and its properties, rather than putting the whole function in preExecution. What I am not able to achieve is having the trend line plotted against one y-axis and the bars against the other. I think this part of the function is what does that:

dimensions: {
    // Explicitly define the "measure" dimension
    // (change the defaults that would otherwise take effect)
    measure: {
        // Hide "measure" from the tooltip
        isHidden: true,

        // Fine-tune the labels
        formatter: function(v) {
            switch(v) {
                case 'Count':      return "Count";
                case 'AvgLatency': return "Avg. Latency";
            }
            return v + '';
        }
    }
},

calculations: [{
    // Split rows into different data parts,
    // depending on the "measure" dimension's value.
    names: 'dataPart',
    calculation: function(datum, atoms) {
        atoms.dataPart =
            datum.atoms.measure.value === 'Count' ?
            '0' : // main plot: bars
            '1' ; // second plot: lines
    }
}],


How should I achieve that using CTools properties or a function? :confused:

Upload report to BA server

Hello,

I am currently working on a project where I need to upload a report to the BA server repository (URL: ***/api/repo/publish/file). I would like to achieve this by using the HTTP Post step and a User Defined Java Class step that produces the request-entity field.
However, I haven't managed to come up with working code. Since my boss does not want me to use external libraries, I am sticking to the org.apache.commons.httpclient classes that are deployed with Kettle.
My approach is to create a Part[] array containing the FilePart and StringParts. The next step is to create a MultipartRequestEntity, which is then written to a ByteArrayOutputStream.

Code:

File filePart = new File(fileReport);
FilePart fileUpload = new FilePart("fileUpload", filePart);
StringPart applyAclPermissions = new StringPart("applyAclPermissions", "true");
StringPart overwriteAclPermissions = new StringPart("overwriteAclPermissions", "true");
StringPart overwriteFile = new StringPart("overwriteFile", "true");
StringPart logLevel = new StringPart("logLevel", "TRACE");
StringPart retainOwnership = new StringPart("retainOwnership", "false");
StringPart fileNameOverride = new StringPart("fileNameOverride", "blablub.prpt");
StringPart importDir = new StringPart("importDir", "/public");

// Note: applyAclPermissions and overwriteAclPermissions are created above
// but not included in this array.
Part[] parts = {
    fileUpload,
    overwriteFile,
    logLevel,
    retainOwnership,
    fileNameOverride,
    importDir
};

HttpMethodParams params = new HttpMethodParams();
MultipartRequestEntity requestEntity = new MultipartRequestEntity(parts, params);

ByteArrayOutputStream bOutput = new ByteArrayOutputStream();
requestEntity.writeRequest(bOutput);
String requestEntityValue = new String(bOutput.toByteArray());
String contentType = requestEntity.getContentType();
String contentLength = String.valueOf(requestEntity.getContentLength());

Object[] outputRow = createOutputRow(r, data.outputRowMeta.size());
get(Fields.Out, "requestEntityValue").setValue(outputRow, requestEntityValue);
get(Fields.Out, "contentType").setValue(outputRow, contentType);
get(Fields.Out, "contentLength").setValue(outputRow, contentLength);
putRow(data.outputRowMeta, outputRow);

return true;

In the next step the data is sent with the HTTP Post step. However, the server is not satisfied with this approach.
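For debugging, it can help to build the multipart body by hand and inspect it. Below is a stdlib-only Python sketch (not Kettle code) of the kind of body the publish endpoint expects; the boundary string and file bytes are placeholders, not values the API requires. One thing it highlights: the multipart body must stay raw bytes end to end, and round-tripping it through a platform-default String (as the Java snippet above does) can corrupt binary .prpt content.

```python
# Build a multipart/form-data body by hand so it can be inspected.
import io

def build_multipart(fields, file_field, file_name, file_bytes,
                    boundary="----kettleBoundary1234"):
    buf = io.BytesIO()
    for name, value in fields.items():
        buf.write(("--%s\r\n" % boundary).encode())
        buf.write(('Content-Disposition: form-data; name="%s"\r\n\r\n' % name).encode())
        buf.write(value.encode() + b"\r\n")
    # The file part carries a filename and stays raw bytes.
    buf.write(("--%s\r\n" % boundary).encode())
    buf.write(('Content-Disposition: form-data; name="%s"; filename="%s"\r\n'
               % (file_field, file_name)).encode())
    buf.write(b"Content-Type: application/octet-stream\r\n\r\n")
    buf.write(file_bytes + b"\r\n")
    buf.write(("--%s--\r\n" % boundary).encode())
    return buf.getvalue(), "multipart/form-data; boundary=" + boundary

body, content_type = build_multipart(
    {"overwriteFile": "true", "importDir": "/public",
     "retainOwnership": "false", "fileNameOverride": "blablub.prpt"},
    "fileUpload", "blablub.prpt", b"\x00\x01binary-report-bytes")
```

Comparing such a hand-built body (field names, boundary placement, CRLF line endings) against what the HTTP Post step actually sends is a quick way to spot where the request diverges from what the server accepts.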

Do you guys have any idea what I am doing wrong?

Thanks for your help!

JobEntryWaitForSQL setNrErrors(1)

Hi,

I'd like to know why, in the method JobEntryWaitForSQL.execute(), a call to

result.setNrErrors(1);

is made, and why the number of errors is never reset to 0 in case of successful completion.
Thanks,

Regards,
Nicolas

Oracle BLOB, CLOB, and XMLTYPE unsupported in version 6.0.0

Hi,
I have downloaded version 6.0.0 and I'm trying to extract a BLOB, a CLOB, and an XMLTYPE from an Oracle database, but I'm getting this error:

Error determining value metadata from SQL resultset metadata For input string: "4294967295"

Does PDI support those types of fields?

Thanks.

No "Column Format" in Table Component - CDE 6

Hi,

I couldn't find the "Column Format" parameter in the Table Component.
I use CDE 6.0.0.0-353 on Pentaho BA 6.0.

How to apply "Column Format" is shown in the "Creating Tables" section of the CDE tutorial.

Thank you.

[bayu]

How to organize the data in an Excel file?

In a text file I get this:
seller; make; model; year; condition; price;
professional; audi; tt; 2000; used; 5000€;
seller; make; model; year; condition; price; fuel;
private; bmw; 118d; 2010; used; 15000€; diesel;
seller; make; model; year; price; fuel; kilometres;
private; audi; a3; 2007; 7500€; gas; 188 000km;
...

and I try to get this result in the output Excel file:

| seller       | make | model | year | condition | price  | fuel   | kilometres |
| professional | audi | tt    | 2000 | used      | 5000€  |        |            |
| private      | bmw  | 118d  | 2010 | used      | 15000€ | diesel |            |
| private      | audi | a3    | 2007 |           | 7500€  | gas    | 188 000km  |
...

I've tried some components, but without success. Does anyone know if there is a component that does this transformation?
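The reshaping logic itself is straightforward: read the file as alternating header/data lines, build one record per pair, and emit rows under the union of all column names. A plain-Python sketch of that logic (outside Kettle; in PDI the same idea can be implemented with a Modified Java Script Value step or by splitting the stream):

```python
# Parse alternating header/data lines into records, then align all records
# to the union of column names, filling gaps with empty strings.
def parse_pairs(lines):
    records = []
    it = iter(l.strip() for l in lines if l.strip())
    for header in it:
        cols = [c.strip() for c in header.rstrip(";").split(";")]
        vals = [v.strip() for v in next(it).rstrip(";").split(";")]
        records.append(dict(zip(cols, vals)))
    return records

lines = [
    "seller; make; model; year; condition; price;",
    "professional; audi; tt; 2000; used; 5000€;",
    "seller; make; model; year; price; fuel; kilometres;",
    "private; audi; a3; 2007; 7500€; gas; 188 000km;",
]
records = parse_pairs(lines)

all_cols = []
for r in records:              # preserve first-seen column order
    for c in r:
        if c not in all_cols:
            all_cols.append(c)
rows = [[r.get(c, "") for c in all_cols] for r in records]
```

The `rows` list (with `all_cols` as the header) is then ready for an Excel Writer step or any CSV/XLSX library.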

CDE Date Range Input component localization

Hello,

How can I add non-English terms to this component? I tried changing the daterangepicker's presetRanges in PostExecution, but I got errors in the onChange listener.

Chinese Data is not loading properly

Kindly help me with a Chinese data issue. I am loading data from a Table input step to a Table output step.

The issue seems to be somewhere else, perhaps in the database settings or system settings, but I can confirm the UTF-8 settings of both my SOURCE and TARGET database tables and columns. Please ask me if you need more info.

My correct data: 目标/校准(调整, and the wrongly loaded data: 目标/æ ¡å‡â€
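That garbled output is the classic signature of UTF-8 bytes being decoded with a single-byte codepage (such as windows-1252) somewhere in the chain, for example in the JDBC connection settings. A quick Python sketch reproducing and reversing the symptom, which can confirm the diagnosis:

```python
# Reproduce the mojibake: UTF-8 bytes decoded as windows-1252.
text = "校准"                                            # correct Chinese data
mojibake = text.encode("utf-8").decode("windows-1252")   # the wrong decode step
# mojibake now looks like "æ ¡å‡†"; reversing the mistake recovers the original:
recovered = mojibake.encode("windows-1252").decode("utf-8")
```

If reversing the transformation on your loaded data recovers the correct Chinese text, the fix is to force UTF-8 on the connection rather than in the table definition; for MySQL this is typically done by adding `characterEncoding=UTF-8` to the JDBC connection options.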

SOURCE COLUMN utf8 setting:




TARGET COLUMN utf8 setting:

Insert/Update: Insert if not exist or update if at least one column changes

Hi,

I want to perform an insert or update (synchronization between source and target tables) based on several columns:

Take for example the following scenario:

source table: marketA

ID_A | ProductA | PriceA
1    | Banana   | 10
2    | Apple    | 4

target table: marketB

ID_B | ProductB | PriceB
1    | Banana   | 10
2    | Apple    | 4

At this moment, as you can see, the target table (marketB) has the same content as the source table (marketA).

If I change the product name and/or the price in my source marketA, I would like my target table to be updated:

source table: marketA

ID_A | ProductA              | PriceA
1    | Banana                | 10
2    | Apple -> Golden Apple | 4 -> 6

It is expected that an update on the target table (marketB) is performed, like:

target table: marketB

ID_B | ProductB     | PriceB
1    | Banana       | 10
2    | Golden Apple | 6

It should insert only if the ID does not already exist.
I tried the Insert/Update step, but it doesn't seem to work. The keys to look up the values should be, I suppose:

TableField | Comparator | StreamField
ProductB   | =          | ProductA
PriceB     | =          | PriceA
ID_B       | =          | ID_A

Update fields:

TableField | StreamField

ProductB | ProductA
PriceB | PriceA


What am I missing?
Using Kettle/PDI, what is the best approach to perform an insert/update for this scenario, knowing that it should update only if at least one non-primary-key column changes, and insert only if the primary key doesn't exist in the target table?

Any suggestion would be great
Thanks in advance.
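The intended semantics can be sketched outside PDI with plain Python dicts as stand-in tables. The key point, and the likely issue with the key configuration above, is that the lookup should use the ID alone; the non-key columns belong in the update fields, where they are compared only to decide whether an UPDATE is needed:

```python
# Insert if the primary key is missing; update if any non-key column differs.
def sync(source, target):
    """source/target: {id: {"product": ..., "price": ...}}"""
    inserted, updated = 0, 0
    for key, row in source.items():
        if key not in target:
            target[key] = dict(row)      # insert: pkey not in target
            inserted += 1
        elif target[key] != row:         # update: some column changed
            target[key] = dict(row)
            updated += 1
    return inserted, updated

src = {1: {"product": "Banana", "price": 10},
       2: {"product": "Golden Apple", "price": 6}}
tgt = {1: {"product": "Banana", "price": 10},
       2: {"product": "Apple", "price": 4}}
counts = sync(src, tgt)                  # (0 inserted, 1 updated)
```

In the Insert/Update step this corresponds to listing only ID_B = ID_A under the lookup keys, and putting ProductB/ProductA and PriceB/PriceA under the update fields.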

Salesforce Insert/Upsert

Hi,
I was trying to use Pentaho to load data into Salesforce.
In Salesforce I have a field of type Number.
While loading data, I intentionally gave the value in one row as a string.
I was expecting that only this record would fail to insert, but to my surprise none of the records are loaded, and the error also propagates to the other, non-erroneous records.

Any ideas as to how to handle this?:(
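Independent of any Salesforce-specific feature, one generic pattern for APIs that reject a whole batch because of one bad record is to bisect failing batches until the bad rows are isolated. A sketch, where `send_batch` is a stand-in for the actual insert call:

```python
# Bisect failing batches: good halves load in bulk, bad rows get isolated.
def load_with_bisect(rows, send_batch):
    """Returns (loaded, rejected). send_batch raises on any bad row."""
    loaded, rejected = [], []
    stack = [rows]
    while stack:
        batch = stack.pop()
        try:
            send_batch(batch)
            loaded.extend(batch)
        except ValueError:
            if len(batch) == 1:
                rejected.extend(batch)       # isolated the bad row
            else:
                mid = len(batch) // 2
                stack.append(batch[:mid])
                stack.append(batch[mid:])
    return loaded, rejected

def send_batch(batch):                       # fake API: integers only
    if any(not isinstance(v, int) for v in batch):
        raise ValueError("INVALID_NUMBER")

loaded, rejected = load_with_bisect([1, 2, "oops", 4], send_batch)
```

In PDI itself, the first things to check are the step's batch size (a batch size of 1 isolates bad rows at the cost of throughput) and whether error handling is enabled on the step's error hop, so failed rows are diverted instead of failing the load.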

Thanks in advance!!

Mondrian Member Formula Sum Error - Bug?

Hi,

I have a member formula to calculate an accumulated value, AccumulateV, across months, like this:

IIF([Period].CurrentMember=[Jan],[Jan],
IIF([Period].CurrentMember=[Feb],SUM({[Jan],[Feb]}),
IIF([Period].CurrentMember=[Mar],SUM({[Jan],[Feb],[Mar]}),[Period].CurrentMember)))

The result is like:

Jan 100 AccumulateV 100
Feb 200 AccumulateV 300
Mar 300 AccumulateV 600

BUT, if there are identical values in these months, say both Jan and Mar are 100, the result will be incorrect, like:
Jan 100 AccumulateV 100
Feb 200 AccumulateV 300
Mar 100 AccumulateV 100

I tried functions like Sum and PeriodsToDate, and explicit formulas like Jan + Feb; all of them gave incorrect values when the data contains identical values. Any hint on this?

Thanks!

Schema Workbench working with aggregated tables

Hi,

I am a newbie using Schema Workbench. I have defined a cube with the classical star schema (fact and dimension tables).
I have one fact table; I want to define some aggregate tables over it, and I want my cube to use those aggregates whenever the SQL runs faster with them.

The cube I have defined uses this detailed fact table. How can I add aggregate tables? Do I have to make any settings in the cube so that it uses the aggregates when they give better SQL performance?

Thanks in advance

First Steps pentaho - transformation

Hello,
I'm new to the forum and to the world of Pentaho.
I have to do a project for an exam, and I'm working on my own, following the professor's lessons.
I have to create a data warehouse.
I created 4 databases in MySQL with phpMyAdmin and set up a local server with MAMP.
Now I have to create the transformations.
To create transformations with data from different databases, do I always have to create the connections in the repository (Tools -> Repositories -> Connect to repository -> "+")?
In addition, for the reconciliation: to reconcile the "State" tables of the various DBs, do I have to create a new database containing a single "State" table into which I put the results of the processing?

DBF: process 12,000,000 rows in sequence?

Hi, I have no clue how to do this.

I have to migrate 12,000,000 rows from a DBF file to MySQL, but my DBF-to-Table-Output transformation takes too long, approximately 3 hours.

Is there any way to do it in chunks, specifying the number of rows? For example: the first 1,000,000 rows, then the next 1,000,000, and so on.

Any hint for that?
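The chunking idea itself is simple, and for insert throughput, committing in fixed-size batches usually matters more than splitting the source file. A plain-Python sketch of the batching logic (`range(...)` stands in for the DBF row source, and the insert call is left as a comment); in PDI, the equivalent knobs are the Table Output step's commit size and its "Use batch update for inserts" option:

```python
# Yield fixed-size batches from any row iterable, so each batch can be
# inserted and committed as one transaction.
def chunks(rows, size):
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch                      # final, possibly smaller batch

migrated = 0
for batch in chunks(range(2_500_000), 1_000_000):  # stand-in row source
    # insert_many(batch)  # one commit per batch
    migrated += len(batch)
```

Keeping one commit per batch avoids both row-by-row commit overhead and a single multi-hour transaction; tuning the batch size is then an empirical matter.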

Thanks in advance

Too many queries when cube with roles using mondrian in hadoop

Hello Folks,

Let me explain the case: we are using a cube with multiple roles, and the database behind it is Hadoop/Spark. Mondrian generates too many queries when I start Saiku (please have a look at the attached log file), and because of these queries it takes a very long time (approximately 5 minutes) to start Saiku each time.

But when I remove all roles from my schema file (cube), it works fine.

The interesting thing is that when I use the same schema file (cube) with all roles against an Oracle database, it does not generate any queries.

I am using Pentaho CE 5.0.1, Saiku 3.3, and mondrian-3.6.7.jar. I have tried multiple versions of the Mondrian jar file without success, and I am also trying multiple versions of Pentaho.

Looking forward to your views on this. Thank you all.

Regards

Nitesh


Below is the Pentaho console log. I have included only part of the log for your reference; it generates multiple queries (approx. 1000) for each role in the schema file:


-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Steel...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
Catalogs:1
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Sampl...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Steel...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
Catalogs:1
Setting role to datasource:finsxbudgetsumexc2 role: Administrator,
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Sampl...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Steel...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
before parse for chinese alias:select version()
after parse for chinese alias:select version()
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code1 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc1 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where pacrpt_ofolapdata_fin_branch_dim.company_desc1 = '(999999)全公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code1, pacrpt_ofolapdata_fin_branch_dim.company_desc1 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code1 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc1 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where pacrpt_ofolapdata_fin_branch_dim.company_desc1 = '(999999)全公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code1, pacrpt_ofolapdata_fin_branch_dim.company_desc1 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code2 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc2 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc2 = '(999993)寿险中区汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code2, pacrpt_ofolapdata_fin_branch_dim.company_desc2 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code2 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc2 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc2 = '(999993)寿险中区汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code2, pacrpt_ofolapdata_fin_branch_dim.company_desc2 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999993' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(099999)湖北分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999993' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(099999)湖北分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code4 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc4 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code3 = '1099999' and pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999993' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc4 = '(090100)襄樊中心支公司' group by pacrpt_ofolapdata_fin_branch_dim.company_code4, pacrpt_ofolapdata_fin_branch_dim.company_desc4 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code4 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc4 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code3 = '1099999' and pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999993' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc4 = '(090100)襄樊中心支公司' group by pacrpt_ofolapdata_fin_branch_dim.company_code4, pacrpt_ofolapdata_fin_branch_dim.company_desc4 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code2 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc2 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc2 = '(999992)寿险南区汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code2, pacrpt_ofolapdata_fin_branch_dim.company_desc2 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code2 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc2 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc2 = '(999992)寿险南区汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code2, pacrpt_ofolapdata_fin_branch_dim.company_desc2 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999992' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(049999)广东分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999992' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(049999)广东分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code4 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc4 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code3 = '1049999' and pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999992' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc4 = '(040500)中山中心支公司' group by pacrpt_ofolapdata_fin_branch_dim.company_code4, pacrpt_ofolapdata_fin_branch_dim.company_desc4 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code4 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc4 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code3 = '1049999' and pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999992' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc4 = '(040500)中山中心支公司' group by pacrpt_ofolapdata_fin_branch_dim.company_code4, pacrpt_ofolapdata_fin_branch_dim.company_desc4 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code2 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc2 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc2 = '(999994)寿险北区汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code2, pacrpt_ofolapdata_fin_branch_dim.company_desc2 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code2 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc2 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc2 = '(999994)寿险北区汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code2, pacrpt_ofolapdata_fin_branch_dim.company_desc2 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999994' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(079999)大连分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999994' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(079999)大连分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999992' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(333333)new测试层级三' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999992' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(333333)new测试层级三' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999994' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(199999)黑龙江分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999994' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(199999)黑龙江分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code4 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc4 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code3 = '1199999' and pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999994' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc4 = '(190000)黑龙江分公司' group by pacrpt_ofolapdata_fin_branch_dim.company_code4, pacrpt_ofolapdata_fin_branch_dim.company_desc4 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code4 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc4 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code3 = '1199999' and pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999994' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc4 = '(190000)黑龙江分公司' group by pacrpt_ofolapdata_fin_branch_dim.company_code4, pacrpt_ofolapdata_fin_branch_dim.company_desc4 order by ISNULL(c0) ASC, c0 ASC
before parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999993' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(169999)云南分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
after parse for chinese alias:select pacrpt_ofolapdata_fin_branch_dim.company_code3 as c0, pacrpt_ofolapdata_fin_branch_dim.company_desc3 as c1 from jt_of_safe.pacrpt_ofolapdata_fin_branch_dim pacrpt_ofolapdata_fin_branch_dim where (pacrpt_ofolapdata_fin_branch_dim.company_code2 = '1999993' and pacrpt_ofolapdata_fin_branch_dim.company_code1 = '1999999') and pacrpt_ofolapdata_fin_branch_dim.company_desc3 = '(169999)云南分公司汇总' group by pacrpt_ofolapdata_fin_branch_dim.company_code3, pacrpt_ofolapdata_fin_branch_dim.company_desc3 order by ISNULL(c0) ASC, c0 ASC
Catalogs:1
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Sampl...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Steel...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
Catalogs:1
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Sampl...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Steel...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
Setting role to datasource:finsxbudgetsumexc2 role: Administrator,
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Sampl...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Steel...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/finsx...
driver=mondrian.olap4j.MondrianOlap4jDriver
Setting role to datasource:finsxbudgetsumexc2_hadoop1_role role: Administrator,
-- listing properties --
location=jdbc:mondrian:Catalog=mondrian:/Sampl...
driver=mondrian.olap4j.MondrianOlap4jDriver
-- listing properties --

Kettle - Query by date in MongoDB Input

I have found multiple posts about this issue but no real solution, so I'm asking the question again.

I am getting a date output field, LMD, from an input table.

I convert the date to a string using a Modified Java Script step:

var dt = date2str(LMD, "yyyy'-'MM'-'dd HH':'mm':'ss'Z'");

and then use this variable in the query parameter of the MongoDB Input step:

{ LMD : { $gte : { $date : "$dt" } } }

This does not work, but it works if I hardcode the value:

{ LMD : { $gte : { $date : "2014-12-31T00:00:00.000Z" } } }

Am I missing something here?
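Two things stand out when comparing the two queries: the working hardcoded value is strict ISO-8601 with a 'T' separator and milliseconds, while the date2str format string produces a space separator and no milliseconds; and the literal "$dt" in the query is just text, so the field value has to actually be substituted into the query string before it reaches MongoDB. A small Python sketch of the string the query needs:

```python
# Build the strict ISO-8601 string that $date extended JSON expects,
# then substitute it into the query text (the substitution step is what
# a literal "$dt" in the query box never does by itself).
from datetime import datetime

lmd = datetime(2014, 12, 31, 0, 0, 0)
iso = lmd.strftime("%Y-%m-%dT%H:%M:%S.000Z")
query = '{ LMD : { $gte : { $date : "%s" } } }' % iso
```

In PDI terms that means producing the full ISO string in the JavaScript step (including the 'T' and milliseconds) and then passing it into the MongoDB Input query via the step's variable/field substitution mechanism, rather than writing the JavaScript variable name into the query.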