Channel: Pentaho Community Forums

Use batch update for inserts doesn't work with the SQL Server JDBC driver

Hi all,
Kettle 7, MS SQL Server 2014.

I copied sqljdbc4.jar to the \lib folder, so the connection to MS SQL works fine.


Use batch update for inserts:
It makes no difference whether this option is checked or not; the result is the SAME.

(I expected: if it is checked and one record of the batch is bad (fails due to constraints), the whole batch would be DISCARDED.)

I verified this both by the results and by watching the SQL statements in Profiler.

exec sp_execute 3,N'gjee-desc',N'PRD-OGP',396.69999999999999,N'O',998
go
exec sp_execute 3,N'slnp-desc',N'PRD-JEC',248.09999999999999,N'A',999
go
exec sp_execute 3,N'qnfy-desc',N'PRD-ICU',848.79999999999995,N'U',1000
go
... and so on, in portions of the <commit size> value set in Table Output.

IF @@TRANCOUNT > 0 COMMIT TRAN



From the Help:
Note: There are limiting factors depending on the database type used and further step options. Batch mode is only used when this check box is selected
1. and the Commit Size is greater than 0,
2. and the Return auto-generated key option is not enabled,
3. and the transformation is not enabled to use unique connections (Transformation settings / Misc / Make the transformation transactional),
4. and the following rule is false: the database type supports safe points and step error handling is enabled (see the database feature list to check if the database supports safe points),
5. and the database type supports batch updates (see the database feature list to check this).

Conditions 1, 2 and 3 are OK.
Condition 4 I am not sure about.

Could it be a problem with the JDBC driver (maybe it does not support batch updates)?
How should batch mode work for MS SQL?

Screen attached.

PS: Generally this behavior is fine for me
(all bad records can be directed to the error output),
but I want to understand how it works.
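
For reference, here is a minimal, self-contained JDBC sketch (my own illustration, not Kettle's actual code; the table and column names are made up) of how batch inserts behave when one row violates a constraint. The driver reports the failure with a BatchUpdateException, and the JDBC spec leaves it to the driver whether the remaining rows in the batch are still executed, so an explicit rollback is what really discards the whole batch:

Code:

import java.sql.BatchUpdateException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertDemo {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details, for illustration only.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=test";
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            con.setAutoCommit(false);
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO products (descr, code, price) VALUES (?, ?, ?)")) {
                ps.setString(1, "gjee-desc"); ps.setString(2, "PRD-OGP"); ps.setDouble(3, 396.7);
                ps.addBatch();
                ps.setString(1, null); ps.setString(2, "PRD-JEC"); ps.setDouble(3, 248.1);
                ps.addBatch(); // assume descr is NOT NULL, so this row fails
                try {
                    ps.executeBatch(); // the whole batch goes to the server in one round trip
                    con.commit();
                } catch (BatchUpdateException e) {
                    // getUpdateCounts() reports which rows succeeded before the failure.
                    con.rollback(); // explicitly discard the whole batch
                }
            }
        }
    }
}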

Date picker params in report

Hello,

I have a problem with a report that includes a date picker as a parameter. In Report Designer 7 the report works, but when I publish it to the server (Pentaho 7.0) it shows "fatal error; error analyzing parameter" as soon as I try to enter a value in the date picker, and then I cannot run the report.
However, it works perfectly when I publish it to Pentaho v6.

I don't know how to resolve this. Any ideas?

Pentaho 7.0 embedded in a Java application with an iframe

Hi,

I migrated my Pentaho server from version 5.3 to version 7.0. It seems that putting the user and password in the URL no longer works in this new version :(. That's why I now use cookie authentication for the Pentaho login, but when I try to access one report through an iframe inside a Java application I get a security error:

URL pentaho : http://pentaho:8080/pentaho/java_app...C4EB1FE&mode=1


Uncaught DOMException: Blocked a frame with origin "http://pentaho:8080" from accessing a cross-origin frame.
at Object.createRequiredHooks (http://pentaho:8080/pentaho/content/....js:372:255985)
at Object.load (http://pentaho:8080/pentaho/content/....js:372:244031)
at http://pentaho:8080/pentaho/content/...js:372:1015592
at Object.execCb (http://pentaho:8080/pentaho/content/...uire.js:29:390)
at Z.check (http://pentaho:8080/pentaho/content/...uire.js:18:423)
at Z.<anonymous> (http://pentaho:8080/pentaho/content/...quire.js:23:27)
at http://pentaho:8080/pentaho/content/...quire.js:8:102
at http://pentaho:8080/pentaho/content/...uire.js:23:461
at v (http://pentaho:8080/pentaho/content/...quire.js:7:173)
at Z.emit (http://pentaho:8080/pentaho/content/...uire.js:23:432)

How can I work around this problem?

Thanks for your quick answer.

Regards

How to identify the instances after the lift chart?

Hi,
  • I ran a J48 and received results in a CSV file.
  • I launched the "Cost/Benefit" analysis, entered values for Profit (in TP) and Cost (in FP), and got the values for maximised cost/benefit. I now have the % of Population (24.3), % of Target (88.9) and Score Threshold (0.161).


I need to find the specific instances that I would like to target for my campaign. How do I do that?
What do I do with the CSV file?
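
If I understand correctly, "identifying the instances" means re-scoring every instance with the trained model and keeping those whose predicted probability for the target class is at or above the Score Threshold (0.161). A minimal Java/Weka sketch of that idea (the file name and the positive-class index are assumptions on my part):

Code:

import weka.classifiers.trees.J48;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class TargetSelection {
    public static void main(String[] args) throws Exception {
        // Hypothetical file name; the threshold comes from the Cost/Benefit analysis.
        Instances data = DataSource.read("campaign.arff");
        data.setClassIndex(data.numAttributes() - 1);

        J48 j48 = new J48();
        j48.buildClassifier(data);

        double threshold = 0.161;
        for (int i = 0; i < data.numInstances(); i++) {
            Instance inst = data.instance(i);
            // distributionForInstance returns the class probabilities;
            // index 0 is assumed to be the "positive" (target) class here.
            double score = j48.distributionForInstance(inst)[0];
            if (score >= threshold) {
                System.out.println("Target instance " + i + " (score " + score + ")");
            }
        }
    }
}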

Thank you,
Tamir

Installing Server CE version 7: I seriously need help.

- I want to install BI Server CE 7 and start using some hypercubes I have created: initially as CSV files I will upload (let's keep it simple), later through some data connection.
- I have been trying for several days, and although it seems "almost" working, it is not working.

- Apparently the install is a no-brainer for most people, so I have probably made some mistakes (hint: I'm 99% ignorant of Java and Tomcat).

Current situation:
- Debian 8.6 in an LXC container
- Java:
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
- created the Linux user "pentaho" and placed it in the sudo group as well
- installed from pentaho-server-ce-7.0.0.0-25.zip by unzipping and moving the generated output: it is now in /home/pentaho/pentaho-server
- changed owner:group to "pentaho" for the whole directory tree under /home/pentaho
- configuration I get from the Kettle sample:
Action Successful

Variable                                     Value
Internal.Kettle.Version                      7.0.0.0-25
Internal.Kettle.Build.Version                1
Internal.Kettle.Build.Date                   2016-11-05 15.35.36
java.runtime.version                         1.8.0_111-b14
os.name                                      Linux
os.version                                   4.4.35-1-pve
os.arch                                      amd64
user.country                                 US
user.language                                en
user.home                                    /home/pentaho
KETTLE_HOME                                  ${KETTLE_HOME}
pentaho.solutionpath                         solution:
Internal.Transformation.Filename.Directory
Internal.Transformation.Filename.Name

Pentaho Interactive Reporting Date time usage

Hello Everyone,

I am a new Pentaho user and am evaluating whether we can use Pentaho for our reporting. I am looking to filter data for an interactive report where a user can select data based on a date and time range. Currently, when I use a Date Picker or even a text box to select/enter a time, it rounds the date to yyyy-MM-dd by default and does not consider the hours, minutes and seconds. I want to be able to filter data when I compare against a date column, selecting data down to the hours, minutes, seconds, etc. Is there a way we can do that?
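
To illustrate the rounding, this trivial Java sketch (my own illustration) shows what is lost when only the yyyy-MM-dd portion of a timestamp is kept:

Code:

import java.text.SimpleDateFormat;
import java.util.Date;

public class DateRoundingDemo {
    public static void main(String[] args) {
        Date now = new Date();
        // What the prompt effectively keeps ...
        System.out.println(new SimpleDateFormat("yyyy-MM-dd").format(now));
        // ... versus the full precision needed to filter within a day.
        System.out.println(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(now));
    }
}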

Any suggestions would be much appreciated. Please let me know if you need any further information.

Database: Oracle
Field datatype: Date
Pentaho Interactive Report
Version: Release 4.8.2

Regex / Replace in string - Explanation on mechanism sought

Hi,

Can anyone help me understand this mechanism better? I think I've come up the curve on regex building, but I'm still stumbling on this process. I am trying to extract one part of the string in a column. I feel like I should be able to do this, but is it a requirement that I define an expression which is collectively exhaustive with respect to the strings in the subject field? This, and grouping, may be the source of my issues.

My current regex:

(^\s)(\d{6,})?(\w.+)?

Example column values (the real data doesn't have quotation marks):


  1. " 3rd party advances consumed related to ZQ123"
  2. " 139888"
  3. " Personnel Costs"
  4. " 12345678"


I'm asking it to replace with $2 as a new field.

I would like to return these values (for strings 1 to 4 above):



  1. Null
  2. 139888
  3. Null
  4. 12345678


Should I be able to do this? What does the Regex Evaluation step actually do, compared with the Extract from String step using a regular expression?
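
To check my understanding of the grouping, here is a small self-contained Java sketch (plain java.util.regex, not the PDI step itself) using the pattern and the four sample strings from above; group(2) comes back null exactly where I expect Null in the output:

Code:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexGroupDemo {
    public static void main(String[] args) {
        // The pattern from above: leading whitespace, then optionally
        // 6+ digits (group 2), then optionally the rest (group 3).
        Pattern p = Pattern.compile("(^\\s)(\\d{6,})?(\\w.+)?");
        String[] samples = {
            " 3rd party advances consumed related to ZQ123",
            " 139888",
            " Personnel Costs",
            " 12345678"
        };
        for (String s : samples) {
            Matcher m = p.matcher(s);
            if (m.find()) {
                // group(2) is null when the optional digit group did not match,
                // which is the Null-vs-number behaviour described above.
                System.out.println("'" + s + "' -> group 2 = " + m.group(2));
            }
        }
    }
}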

Thanks,

Andy

How to create a repository in BI Server 6.1

Hi, I am using Pentaho BI Server 6.1.

Please let me know how to create a repository in the BI server.

Please, please!

Table Output Step in PDI resulting in java.lang.ArrayIndexOutOfBoundsException

Hi,

I am currently using Kettle PDI version 5.2.02. My transformation, which contains a Table Output step, fails intermittently with the following exception:


2017/01/16 22:59:02 - Stage Scala Data.3 - Error inserting row into table [scala_pstn_stg_v] with values: [2017/01/16 22:56:25.000000000], [2017/01/16 22:56:24.000000000], [2], [2017/01/16 00:00:00.000000000], [D], [3787], [0062], [TRDG], [0062], [NAP], [10025113], [001], [0634], [M], [0.000010], [20170116], [0], [0], [0], [-0], [-0], [-0], [-0], [-0], [-0], [4445], [1], [L], [TRD], [1], [2], [STK], [10025113], [0], [1199057357]
2017/01/16 22:59:02 - Stage Scala Data.3 -
2017/01/16 22:59:02 - Stage Scala Data.3 - Unexpected error inserting/updating row in part [insertRow exec update]
2017/01/16 22:59:02 - Stage Scala Data.3 - 6
2017/01/16 22:59:02 - Stage Scala Data.3 -
2017/01/16 22:59:02 - Stage Scala Data.3 -
2017/01/16 22:59:02 - Stage Scala Data.3 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:377)
2017/01/16 22:59:02 - Stage Scala Data.3 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:118)
2017/01/16 22:59:02 - Stage Scala Data.3 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/01/16 22:59:02 - Stage Scala Data.3 - at java.lang.Thread.run(Thread.java:745)
2017/01/16 22:59:02 - Stage Scala Data.3 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
2017/01/16 22:59:02 - Stage Scala Data.3 - Unexpected error inserting/updating row in part [insertRow exec update]
2017/01/16 22:59:02 - Stage Scala Data.3 - 6
2017/01/16 22:59:02 - Stage Scala Data.3 -
2017/01/16 22:59:02 - Stage Scala Data.3 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1272)
2017/01/16 22:59:02 - Stage Scala Data.3 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:255)
2017/01/16 22:59:02 - Stage Scala Data.3 - ... 3 more
2017/01/16 22:59:02 - Stage Scala Data.3 - Caused by: java.lang.ArrayIndexOutOfBoundsException: 6
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.VarnumBinder.big5pow(OraclePreparedStatement.java:15348)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.VarnumBinder.constructPow52(OraclePreparedStatement.java:15420)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.VarnumBinder.dtoa(OraclePreparedStatement.java:15884)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.DoubleBinder.bind(OraclePreparedStatement.java:17239)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.OraclePreparedStatement.setupBindBuffers(OraclePreparedStatement.java:3150)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.OraclePreparedStatement.processCompletedBindRow(OraclePreparedStatement.java:2368)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3592)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3678)
2017/01/16 22:59:02 - Stage Scala Data.3 - at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeUpdate(OraclePreparedStatementWrapper.java:1352)
2017/01/16 22:59:02 - Stage Scala Data.3 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1235)
2017/01/16 22:59:02 - Stage Scala Data.3 - ... 4 more


This happens intermittently and is not consistent. The Oracle version is 11.2.0.4. The error occurs on the production instance and, unfortunately, is not reproducible with the same code base in the SIT and UAT environments.

Has anyone faced this kind of issue before?

Unable to get XML data from an intermediate level

I'm trying to read a three-level XML file:

Code:

<?xml version="1.0" encoding="ISO-8859-1"?>
<CAB>
<NUMPED>ped1</NUMPED>
<LINALB>
<REFPROV>prov1</REFPROV>
<LOTE>
<NUMLOTE>lote1</NUMLOTE>
</LOTE>
</LINALB>
<LINALB>
<REFPROV>prov2</REFPROV>
<LOTE>
<NUMLOTE>lote2</NUMLOTE>
</LOTE>
</LINALB>
</CAB>

Loop XPath: /CAB/LINALB/LOTE

No problem reading the detail field:
XPath: NUMLOTE

No problem reading the CAB field:
XPath: //NUMPED

But there is no way to get the correct /LINALB data. I tried:

XPath: /REFPROV -> I get a null value
XPath: /CAB/LINALB/REFPROV -> I get "prov1" for all lines instead of "prov1" on the first line and "prov2" on the second.
XPath: //REFPROV -> I get "prov1" for all lines as well (I expected a null or an error on this one...)

Any idea what I'm doing wrong?
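
A standalone Java sketch of the context issue as I understand it (plain javax.xml.xpath, not PDI internals; the file name is assumed): with each LOTE node as the context, the relative path ../REFPROV picks up that line's own REFPROV, while //REFPROV is always resolved from the document root and returns the first match:

Code:

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class XPathContextDemo {
    public static void main(String[] args) throws Exception {
        // Parse the sample file from above (file name is an assumption).
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File("cab.xml"));
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Same loop as the step's Loop XPath: one context node per LOTE.
        NodeList lotes = (NodeList) xpath.evaluate(
                "/CAB/LINALB/LOTE", doc, XPathConstants.NODESET);
        for (int i = 0; i < lotes.getLength(); i++) {
            Node lote = lotes.item(i);
            // Relative to the current LOTE: its parent LINALB's REFPROV.
            String refprov = xpath.evaluate("../REFPROV", lote);
            // Absolute: always resolved from the root, first match wins.
            String first = xpath.evaluate("//REFPROV", lote);
            System.out.println(refprov + " vs " + first);
        }
    }
}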

Thanks

Importing a CSV file with decimals

Hi!

I can't seem to import a simple CSV file that looks like this:
Date,Open,High,Low,Close,Volume,Adj Close
2017-01-12,200.30,20060,19750,19780,1186400,19780

The problem occurs with the "." in the second field.
I have set the field to Numeric and tried different formats (#.# etc.), but nothing works. I've even tried saving the file in different encodings, like UTF-8 and ANSI, with no result. At first I got an error like "the fourth character is not numeric", but now I just get an empty error message. I'm using the data source wizard...

Any ideas?
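
One guess (purely an assumption on my part) is a locale mismatch between the file's "." decimal separator and the default locale of the machine; this Java sketch shows how the same "200.30" parses cleanly with dot-decimal symbols but is silently truncated at the "." with comma-decimal (e.g. Swedish) symbols:

Code:

import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.text.ParseException;
import java.util.Locale;

public class DecimalSeparatorDemo {
    public static void main(String[] args) {
        String value = "200.30"; // the problem field from the CSV above

        // A locale whose decimal separator is '.' ...
        DecimalFormat us = new DecimalFormat("#.#",
                DecimalFormatSymbols.getInstance(Locale.US));
        // ... versus a comma-decimal locale such as Swedish.
        DecimalFormat sv = new DecimalFormat("#.#",
                DecimalFormatSymbols.getInstance(new Locale("sv", "SE")));

        for (DecimalFormat df : new DecimalFormat[] { us, sv }) {
            try {
                // parse() stops at the first character it cannot interpret,
                // so the comma-decimal locale returns 200 instead of 200.3.
                System.out.println(df.parse(value));
            } catch (ParseException e) {
                System.out.println("failed: " + e.getMessage());
            }
        }
    }
}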

BR
Daniel

Microsoft Excel Writer: very poor support for metadata injection

"Microsoft Excel Writer" don´t support filename injection. Why ??

Capturing Step Metrics for a Job

Hi all,
I am seeking to capture step metrics for a shell script that I am running.
The problem is that the shell script step is only available within a job, while output step metrics are only available from within a transformation.
When I create a transformation that executes the job, it does not capture the actual metrics for that job run, nor even its success or failure. How can I bridge the gap between the two?
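
To make concrete what metrics I mean: when a job is run from code, Kettle's own Result object already carries the success/failure information I'm after, as in this minimal sketch against the Kettle API (the .kjb path is hypothetical):

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class JobResultDemo {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();
        // Hypothetical path to the job that wraps the shell-script entry.
        JobMeta jobMeta = new JobMeta("shell_script_job.kjb", null);
        Job job = new Job(null, jobMeta);
        job.start();
        job.waitUntilFinished();

        // The Result object carries the success/failure and basic counters
        // that a wrapping transformation could log as metrics.
        Result result = job.getResult();
        System.out.println("errors=" + result.getNrErrors()
                + ", success=" + result.getResult()
                + ", exit=" + result.getExitStatus());
    }
}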

Attached are the job and transformation I'm running (the script called by the job will be non-functional).

PDI 7 - Spoon.bat - nothing launches on Windows 7

Hey there--

Windows 7 computer: PDI 5.1, 5.4 and 6.0 all launch on it (though 6 takes forever). PDI 7 seemingly gives no response when I launch Spoon.bat.



I have another computer running Windows 10 that runs PDI 7 just fine...
Not sure what's going on here; all the settings are the same. Why won't it launch? I'm using Java 8 Update 91 on the problem Windows 7 computer (Update 101 on the Windows 10 one). Both are 64-bit.


What kind of diagnostics can I do here?

Connection Timed Out

What would cause a "connection timed out" error message?

When trying to run something on the server, it says it can't open something in a repository (I know it's there) and then gives me a "connection timed out" error message.

Why do you think it is having a hard time opening something in the repository?

Unable to create a crosstab report

Hi,

I'm trying to create a report with a crosstab in it. I think I'm missing something, but I don't know what.
I attached a test report along with an Excel file containing some data.
The error does not specify the problem:
  • org.pentaho.reporting.engine.classic.core.ReportProcessingException: Failed to process the report


Can you check it and tell me what the problem is?

The Report Designer version is 7.0.0.0-25.

Thanks,
Andrei

Error connecting to FTPS server from local system.

Hi,

I'm using " Get Files via FTPS " on Pentaho Data Integration and after insert IP address, user and password, it's getting " ftp IO exception return value 530 description error connection " Can someone say if is a problem with the connection or I'm missing something at this ETL.

Please find the below attached image for your reference.
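
For what it's worth, 530 is the standard FTP "Not logged in" reply code, so one way to rule out bad credentials or a TLS-mode mismatch is to try the same login outside PDI; a minimal sketch using Apache Commons Net (host and credentials are placeholders):

Code:

import org.apache.commons.net.ftp.FTPSClient;

public class FtpsLoginCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host and credentials, just to isolate the 530 reply.
        FTPSClient ftps = new FTPSClient(); // explicit FTPS ("AUTH TLS") by default
        ftps.connect("ftp.example.com", 21);
        boolean ok = ftps.login("user", "password");
        // 530 means the server rejected the login: the credentials are wrong
        // or the server expects a different TLS / implicit-FTPS setup.
        System.out.println("login ok=" + ok + ", reply=" + ftps.getReplyCode());
        if (ok) {
            ftps.logout();
        }
        ftps.disconnect();
    }
}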

Query component this.metadata

Hello,

I have a CDE dashboard that runs without any issue on Pentaho 5.4 CE.

I tried to migrate it to Pentaho 7.0, but the Query component is no longer working properly.

In the component's Post Execution I use:
this.metadata
this.queryInfo
this.resultset
However, these are now undefined.

So, what is now the correct way to get the details of my query?

Do not hesitate to ask if you need more details.

Thanks in advance for your help.
Regards,
Bertrand

I can't access Pentaho Data Integration

Hello,

I have a problem: Pentaho Data Integration is hosted on a server and I don't know how to access it.
I can only access the URL XXX:8080/pentaho.

Help me please


Enable Relative Date Filters using a language other than English

Hi,
I usually use this technique (https://help.pentaho.com/Documentati...20/070/030/000) for my date dimension, and it works correctly.
Here is the dimension's definition:

Code:

  <Dimension type="TimeDimension" visible="true" highCardinality="false" name="Date">
    <Hierarchy name="Date" visible="true" hasAll="true" primaryKey="date_id">
      <Table name="date_dim">
      </Table>
      <Level name="Year" visible="true" column="year" ordinalColumn="year" type="Integer" uniqueMembers="true" levelType="TimeYears" hideMemberIf="Never">
          <Annotations><Annotation name="AnalyzerDateFormat">[yyyy]</Annotation></Annotations>
      </Level>
      <Level name="Year Quarter" visible="true" column="quarter_of_year" ordinalColumn="quarter_of_year" type="String" uniqueMembers="false" levelType="TimeQuarters" hideMemberIf="Never">
        <Annotations><Annotation name="AnalyzerDateFormat">[yyyy].[yyyy 'Q'q]</Annotation></Annotations>
      </Level>
      <Level name="Year Month" visible="true" column="month_of_year" ordinalColumn="year_month_number" type="String" uniqueMembers="false" levelType="TimeMonths" hideMemberIf="Never">
        <Annotations><Annotation name="AnalyzerDateFormat">[yyyy].[yyyy 'Q'q].[yyyy MMM]</Annotation></Annotations>
      </Level>
      <Level name="Full Date" visible="true" column="date" ordinalColumn="date_id" type="Date" uniqueMembers="false" levelType="TimeDays" hideMemberIf="Never">
          <Annotations><Annotation name="AnalyzerDateFormat">[yyyy].[yyyy 'Q'q].[yyyy MMM].[yyyy-MM-dd]</Annotation></Annotations>
      </Level>
    </Hierarchy>
  </Dimension>

Now I need to create a date dimension containing some values in a different language (Italian, in this case).
The dimension's definition in the schema doesn't change; the only things that change are the values of the month_of_year column. For example, "2017 Jan" becomes "2017 Gen".

After populating the date_dim table this way, the date filters in Mondrian don't work anymore.
If I try to filter, for example, on the previous 2 Full Date members, I can see that the MDX query sets the filter this way:

Code:

SET [*BASE_MEMBERS__Date_] AS '{[Date].[2017].[2017 Q1].[2017 Jan].[2017-01-16],[Date].[2017].[2017 Q1].[2017 Jan].[2017-01-17]}'
Of course this cannot work, because the value of the "Year Month" level is "2017 Gen" and not "2017 Jan".

What can I do to make this filter work?
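
To make the mismatch concrete: the "yyyy MMM" part of the AnalyzerDateFormat pattern is a Java date pattern, and Java formats month abbreviations per locale, as this small sketch (my own illustration) shows:

Code:

import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Locale;

public class MonthNameDemo {
    public static void main(String[] args) {
        Calendar jan2017 = Calendar.getInstance();
        jan2017.set(2017, Calendar.JANUARY, 16);

        // The same "yyyy MMM" pattern from the AnalyzerDateFormat annotation
        // produces different member names depending on the locale in effect.
        System.out.println(new SimpleDateFormat("yyyy MMM", Locale.ENGLISH)
                .format(jan2017.getTime())); // 2017 Jan
        System.out.println(new SimpleDateFormat("yyyy MMM", Locale.ITALIAN)
                .format(jan2017.getTime())); // 2017 gen
    }
}

So the member names the filter generates and the Italian values stored in date_dim have to agree exactly for the relative date filter to resolve.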

Thank you

Clotis

