Channel: Pentaho Community Forums

Calculated measure made unavailable

Hi

I'm trying Pentaho Analyzer (Enterprise evaluation).

I added calculated measures to my analysis report. After modifying my model (I added a field), all the calculated fields are missing. The message is: "Some fields in this report have been made unavailable by the administrator", etc.

What should I do about that?

Thanks in advance for any reply.

Eric.

Validation check for date

Hi,

I am new to Pentaho.

I have a requirement to check whether a particular date coming from a source file is valid and, if it is invalid, to replace it with the default date 1900-01-01 (see the sketch after the example below).

E.g., if the dates in the source file are:

1) 2014-01-01
2) 2015-25-25
3) 2013-01-34

Output

1) 2014-01-01
2) 1900-01-01 ---- replaced with default date
3) 1900-01-01 ---- replaced with default date
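
In case it helps, here is a minimal Java sketch of that check (for example inside a User Defined Java Class or similar scripting step); the class and method names are assumptions, and the key point is that a non-lenient SimpleDateFormat rejects impossible dates such as 2013-01-34 instead of rolling them over:

Code:

import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateDefaulter {
    // Strict parser: lenient = false makes 2015-25-25 and 2013-01-34 fail instead of rolling over.
    private static final SimpleDateFormat FORMAT = new SimpleDateFormat("yyyy-MM-dd");
    static { FORMAT.setLenient(false); }

    /** Returns the input if it is a valid yyyy-MM-dd date, otherwise the default 1900-01-01. */
    public static String validOrDefault(String candidate) {
        try {
            FORMAT.parse(candidate);
            return candidate;
        } catch (ParseException e) {
            return "1900-01-01";
        }
    }

    public static void main(String[] args) {
        System.out.println(validOrDefault("2014-01-01")); // kept as-is
        System.out.println(validOrDefault("2015-25-25")); // replaced with 1900-01-01
        System.out.println(validOrDefault("2013-01-34")); // replaced with 1900-01-01
    }
}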

Thanks in advance!!
-Santosh

Unable to read number '+002.323' with Fixed Input step

Hi,

I have to read a text file with fixed-length fields in it, and there are a bunch of decimal fields conforming to the following pattern:

  • 1st char: plus or minus sign.
  • 2nd - 4th char: integer part of the number
  • 5th char: decimal point
  • 6th - 8th char: decimal part of the number


Examples:
+100.000
-023.240
+000.000
+001.000


I cannot get the number recognized by the "Fixed Input" step.
Is there any solution without adding another step?

The only way I have found to read the number is to set the field type to String and then process it in another step after this one.
What would be the best way to convert the string field into a proper number field (bearing in mind there are 30+ fields of this type in the record)?
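
For what it's worth, the conversion itself can be expressed with a Java DecimalFormat whose pattern carries an explicit plus prefix; this is only a sketch of the logic (for example for a User Defined Java Class step), and the same pattern string, +000.000;-000.000, may also be worth trying directly as the format mask of the field in the Fixed Input step:

Code:

import java.math.BigDecimal;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.text.ParseException;
import java.util.Locale;

public class SignedFixedNumberParser {
    // Pattern with an explicit '+' positive prefix so both "+002.323" and "-023.240" parse.
    private static final DecimalFormat FORMAT =
            new DecimalFormat("+000.000;-000.000", DecimalFormatSymbols.getInstance(Locale.US));
    static { FORMAT.setParseBigDecimal(true); } // keep the exact decimal value

    public static BigDecimal parse(String raw) throws ParseException {
        return (BigDecimal) FORMAT.parse(raw.trim());
    }

    public static void main(String[] args) throws ParseException {
        // Sample values from the post.
        for (String raw : new String[]{"+100.000", "-023.240", "+000.000", "+001.000"}) {
            System.out.println(raw + " -> " + parse(raw));
        }
    }
}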

How to implement a control-based queuing mechanism in BI Server 5.x.x or later

Hi All,

:(
I'm using Pentaho BI Server CE 6.0.0.0-353.
I want to implement a queuing mechanism for the execution of reports, so that reports only execute when resources are free on the database server.

Can anyone tell me whether this is possible in Pentaho :confused: or whether I need to use another tool for it? :rolleyes:
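
I don't know of a built-in queue for this in the CE server, but as a rough illustration of the idea, here is a minimal Java sketch of limiting concurrent report executions to a fixed number of slots; everything in it (class names, the slot count, the Runnable standing in for a report run) is a hypothetical wrapper you would have to build around your own report-execution code, not an existing Pentaho API:

Code:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ReportQueue {
    // Only maxConcurrentReports run at once; the rest wait in the executor's internal queue.
    private final ExecutorService slots;

    public ReportQueue(int maxConcurrentReports) {
        this.slots = Executors.newFixedThreadPool(maxConcurrentReports);
    }

    /** Queues a report run; it starts only when one of the slots is free. */
    public void submit(Runnable reportRun) {
        slots.submit(reportRun);
    }

    public void shutdown() {
        slots.shutdown();
    }

    public static void main(String[] args) {
        ReportQueue queue = new ReportQueue(2); // e.g. at most 2 reports hitting the database at once
        for (int i = 1; i <= 5; i++) {
            final int reportId = i;
            queue.submit(() -> System.out.println("running report " + reportId));
        }
        queue.shutdown();
    }
}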


thanks in advance.
:)


Reg,
Anand

PDI x Oracle Advanced Security (OAS)

Does PDI work with Oracle Advanced Security (OAS)? It would be used instead of a JNDI connection.

I've been searching for information about this, but I can't find anything.


Thanks.

Does Pentaho give an option to choose the mode of delivery of reports – email or FTP?

Hi All,

I'm using Pentaho BI Server CE 5.4 and the latest version, 6.0.0.0-353.
I want to implement/customize the scheduling pop-up so that it offers an option to choose the mode of delivery of reports (email or FTP),
and also enforce that any report over 5 MB is delivered via FTP only.

Is this customization possible in the latest BI Server CE version or in 5.x.x? If so, does it require the source code, and how would I go about it (using a tool such as Eclipse)?
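
In case it helps while you look into the customization itself, the 5 MB routing rule is simple to express in code; below is a minimal Java sketch where the file name and the delivery channels are placeholders for whatever email/FTP code ends up being wired into the scheduler:

Code:

import java.io.File;

public class DeliveryRouter {
    private static final long MAX_EMAIL_BYTES = 5L * 1024 * 1024; // 5 MB limit from the requirement

    /** Decides how a generated report file should be delivered. */
    public static String chooseDelivery(File report) {
        return report.length() > MAX_EMAIL_BYTES ? "FTP" : "EMAIL";
    }

    public static void main(String[] args) {
        File report = new File("report.pdf"); // hypothetical generated report
        System.out.println("Deliver via: " + chooseDelivery(report));
    }
}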

waiting for reply.....

Thanks in Advance


Thanks,
Anand

Pentaho Connectivity Issue with the SQL Server using Windows Authentication

Hi Everyone,

I am trying to connect to a Microsoft SQL Server database via JDBC from a Pentaho transformation, using Windows authentication. I tried several solutions available on the net, e.g. adding the sqljdbc_auth.dll file in several places, but it didn't work.

The error coming up is as follows:

Error connecting to database [database name] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database..
Login failed for 'particular user'..

Can anyone help me resolve this? It would be a great help.
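
For reference, here is a hedged sketch of what the JDBC side of Windows (integrated) authentication usually looks like with the Microsoft driver; the host, port and database name are placeholders, and sqljdbc_auth.dll still has to match the JVM's bitness and be reachable via java.library.path for this to work:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class SqlServerIntegratedAuth {
    public static void main(String[] args) throws SQLException {
        // integratedSecurity=true tells the driver to authenticate as the Windows account
        // running the JVM, which is what requires the matching sqljdbc_auth.dll.
        String url = "jdbc:sqlserver://dbhost:1433;databaseName=mydb;integratedSecurity=true";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected as the current Windows user.");
        }
    }
}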

Regards,
Chaitanya

Kettle-Spoon 5.3 with Java 8

Hi,
I have been using Kettle-Spoon 5.3 with Java 7 update 67. The company plans to remove all Java 7 installations this Friday and install Java 8. Can I still use Kettle-Spoon 5.3 with Java 8, or do I have to upgrade Kettle to version 6 to use it with Java 8?

Thank you.

Table Input from - Execute SQL from File

I have a series of SQL files I would like to run, inserting the results into a MySQL database.

I have the SQL Script step set up to run from a file, but I cannot figure out how to create a table from the results of the SQL script.
(The SQL script I'm running is a SELECT statement.)

I am on a Windows machine, so I cannot use the MySQL Bulk Loader.

Is it possible to have a Table Input (or Table Output) step run SQL from a file (as is possible in the SQL Script step)? If not, how would I go about doing this?
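
As far as I know the Table Input step only takes inline SQL (optionally with variable substitution), so one workaround is to load the file's text into a variable in a preceding transformation and reference that variable in Table Input with "Replace variables in script" enabled. If you would rather do it outside PDI, here is a minimal Java sketch that reads the SELECT from a file and creates the target table from it, assuming the query runs against the same MySQL server you are loading; the file name, table name and connection details are placeholders:

Code:

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SqlFileToTable {
    public static void main(String[] args) throws Exception {
        // Placeholder file name and connection details.
        String selectSql = new String(
                Files.readAllBytes(Paths.get("query.sql")), StandardCharsets.UTF_8);
        String url = "jdbc:mysql://localhost:3306/targetdb?user=etl&password=secret";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // CREATE TABLE ... AS SELECT builds the target table directly from the file's query.
            stmt.execute("CREATE TABLE target_table AS " + selectSql);
        }
    }
}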

Please Advise.

Error on Pentaho Map Reduce Step connecting to Cloudera

I am getting the following error when trying to execute a job that invokes the Pentaho MapReduce step; it seems that the property mapreduce.job.name has not been set by the step and must be included manually in a configuration file (a small sketch of setting it follows the log below).

2015/12/01 16:39:01 - Pentaho MapReduce - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : The value of property mapreduce.job.name must not be null
2015/12/01 16:39:01 - Pentaho MapReduce - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : java.lang.IllegalArgumentException: The value of property mapreduce.job.name must not be null
2015/12/01 16:39:01 - Pentaho MapReduce - at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.conf.Configuration.set(Configuration.java:1048)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.conf.Configuration.set(Configuration.java:1029)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.mapred.JobConf.setJobName(JobConf.java:1443)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.mapreduce.Job.setJobName(Job.java:1023)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.hadoop.shim.cdh54.ConfigurationProxyV2.setJobName(ConfigurationProxyV2.java:61)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:525)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.execute(Job.java:730)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.execute(Job.java:873)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.execute(Job.java:546)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.run(Job.java:435)
2015/12/01 16:39:01 - weblog_parse_mr - Finished job entry [Pentaho MapReduce] (result=[false])
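
For what it's worth, the property the shim is complaining about is an ordinary Hadoop configuration entry, so it can be supplied explicitly. A minimal sketch of setting it on a Hadoop Configuration is below; in PDI itself the equivalent would be making sure the Pentaho MapReduce job entry has a non-empty Hadoop job name, or adding the property to the cluster's configuration files as you suggest:

Code:

import org.apache.hadoop.conf.Configuration;

public class JobNameWorkaround {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Any non-null value satisfies the check that fails in Configuration.set()
        // in the stack trace above; the name here is just an example.
        conf.set("mapreduce.job.name", "weblog_parse_mr");
        System.out.println(conf.get("mapreduce.job.name"));
    }
}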

Kettle Crashes when I click either the connection Edit... or Add... buttons

I've recently upgraded to 6.0.0 running under Java 8 on a MacBook Pro, and Spoon crashes every time I click the Edit... or New... buttons to alter connection details. If I simply start Spoon, create a new transformation file, add a Table Input step, then open it and click one of the aforementioned buttons, it crashes with the attached error.

Any ideas? Perhaps I need to provide more details about my environment? Thanks in advance.

CE 6 on Ubuntu 14.04 with own PostgreSQL repository

Hello.

Can anyone send me a tutorial on how to configure CE 6 on Ubuntu 14.04 to connect to a repository on PostgreSQL or MySQL? I have searched, but all the tutorials are for earlier versions, and after changing the files CE 6 doesn't start.

How to write a BLOB field to a file

I am trying to write a BLOB field to a file. The BLOB field stores image data. I tried blobsinkettle.zip, but the generated file is 0 KB.
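
In case it helps, here is a minimal Java sketch of the underlying idea (read the BLOB bytes and stream them to a file); the connection URL, query, column and file names are placeholders, and inside a transformation the same logic could go into a User Defined Java Class step fed by a Table Input that selects the BLOB column:

Code:

import java.io.FileOutputStream;
import java.io.InputStream;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class BlobToFile {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/imagedb?user=etl&password=secret"; // placeholder
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT image_data FROM images WHERE id = 1")) {
            if (rs.next()) {
                Blob blob = rs.getBlob("image_data");
                try (InputStream in = blob.getBinaryStream();
                     FileOutputStream out = new FileOutputStream("image_1.png")) {
                    byte[] buffer = new byte[8192];
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read); // stream the BLOB bytes straight to disk
                    }
                }
            }
        }
    }
}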

Install R on Spoon - Linux server

Hi All,
What is the best way to install R on a PDI Linux server?

Thanks.

Error connecting to SQL Server when running job in a batch, running fine from Spoon

Hi colleagues,

I have an issue running a PDI job in batch, though it runs perfectly from Spoon.
The error is:
Error connecting to database: (using class com.microsoft.sqlserver.jdbc.SQLServerDriver)
The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "RSA premaster secret error".

Obviously, I've added sqljdbc4.jar to the libext\JDBC folder.

Please advise.
Thanks,
Sveta

JOIN using CONCAT

I need to replicate this query:

Code:

SELECT *
FROM a
right outer join
b
ON b.field1 LIKE CONCAT(a.field2, '%')

I don't know how Kettle can replicate LIKE and CONCAT in a join.
Any ideas?
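
Since b.field1 LIKE CONCAT(a.field2, '%') is really just a prefix test, one option is a Join Rows (cartesian product) step followed by a filter (or a condition inside the join step) that applies it, keeping in mind that the RIGHT OUTER part, i.e. unmatched rows of b, would still need separate handling. As a rough sketch of the comparison itself, in plain Java with the field names from your query:

Code:

public class PrefixJoinCondition {
    /** Java equivalent of: b.field1 LIKE CONCAT(a.field2, '%') */
    public static boolean matches(String bField1, String aField2) {
        // NULL operands never match, mirroring SQL LIKE semantics.
        if (bField1 == null || aField2 == null) {
            return false;
        }
        return bField1.startsWith(aField2);
    }

    public static void main(String[] args) {
        System.out.println(matches("ABC123", "ABC")); // true
        System.out.println(matches("XYZ123", "ABC")); // false
    }
}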

Thanks!

Connection Type - Generic Database vs. Defined

Hi everyone

I need to get numbers from a database (Oracle) with the decimal separator set to ',' and a space as the grouping separator.
I've added parameters in spoon.bat %PENTAHO_DI_JAVA_OPTIONS% "-Duser.language=pl" "-Duser.region=PL"

Now, when I use the Generic Database connection type with a custom URL, it works fine and the query returns commas in the numbers, but when I use the Oracle connection type the query still returns dots.
I've compared the "Feature List" of these two connections and noticed no differences - both of them create the same URL and driver class (jdbc:oracle:thin:@<IP_ADDRESS>:1521:<SID>, oracle.jdbc.driver.OracleDriver), and the other parameters also look the same.

Could you tell me what is going on?
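
As a data point for narrowing it down, the comma vs. dot usually comes from the locale the formatting code ends up using; here is a small Java sketch of that effect on its own, with no Pentaho involved, just the locale-dependent number formatting:

Code:

import java.text.NumberFormat;
import java.util.Locale;

public class DecimalSeparatorDemo {
    public static void main(String[] args) {
        double value = 1234.56;
        // Polish locale: comma as decimal separator, space as grouping separator.
        System.out.println(NumberFormat.getNumberInstance(new Locale("pl", "PL")).format(value));
        // US locale: dot as decimal separator, comma as grouping separator.
        System.out.println(NumberFormat.getNumberInstance(Locale.US).format(value));
    }
}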

Column Value same throughout

Hi, I'm using PDI 5.2. While consolidating three .CSV files I'm facing a problem: there is a common column in all three files, but its value is different in each, yet after consolidation it shows the same value throughout. In the file attached below, in the output console, the column "Exp_Name" contains the same value throughout, but in the source it has different values.
The steps I have used are Text File Input ---> Add Constants ---> Calculator.
Please help me with this.

Unable to maintain hierarchy in JSON output

Hello!
I am transforming XML to JSON in Kettle. I have an XML file with a parent-child hierarchy, and I want the same hierarchy to be maintained in the JSON output. I am using Get data from XML, and only the individual elements are getting populated in the fields window. For example, my structure is:

<department>
<name>ADMINISTRATION</name>
<id>1</id>
<employee>
<EmpID>7956</EmpID>
</employee>
</department>
Here, name, id and EmpID are populated in the JSON, but the Department-Employee hierarchy isn't maintained in the JSON output.
How can I have the parent-child structure maintained in the output?
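
For what it's worth, the stock JSON output in Kettle is essentially flat, so keeping the nesting usually means rebuilding the structure in a scripting step or after the fact. Here is a minimal Java sketch (using Jackson as one possible library, which is an assumption) that rebuilds the department/employee nesting from the three flat fields; the field names come from your XML, everything else is illustrative:

Code:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class NestedJsonBuilder {
    public static void main(String[] args) throws Exception {
        // Flat fields as they come out of Get data from XML.
        String name = "ADMINISTRATION";
        String id = "1";
        String empId = "7956";

        ObjectMapper mapper = new ObjectMapper();
        ObjectNode department = mapper.createObjectNode();
        department.put("name", name);
        department.put("id", id);
        // Nest the employee object under the department to preserve the XML hierarchy.
        department.putObject("employee").put("EmpID", empId);

        ObjectNode root = mapper.createObjectNode();
        root.set("department", department);
        System.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(root));
    }
}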

Problems with logging to files using PDI 6.0

I have a set of jobs and transformations and use file-based logging on execution.

After upgrading to PDI 6.0, it looks like the logging stops being written to the files and also to the Spoon log tab. No error messages are shown.
I tried increasing the values related to logging in PDI (set roughly as in the kettle.properties sketch after the list below), but it makes no difference:
  • KETTLE_MAX_JOB_ENTRIES_LOGGED = 5000 - the maximum number of job entry results kept in memory for logging purposes.
  • KETTLE_MAX_JOB_TRACKER_SIZE = 5000 - the maximum number of job trackers kept in memory.
  • KETTLE_MAX_LOGGING_REGISTRY_SIZE = 5000 - the maximum number of logging registry entries kept in memory for logging purposes.
  • KETTLE_MAX_LOG_SIZE_IN_LINES = 100000 - the maximum number of log lines kept internally by Kettle; set to 0 to keep all rows (default).
  • KETTLE_MAX_LOG_TIMEOUT_IN_MINUTES = 0 - the maximum age (in minutes) of a log line kept internally by Kettle; set to 0 to keep all rows indefinitely (default).
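
A kettle.properties sketch with those values, for reference (the values are the ones listed above, not a recommendation):

Code:

KETTLE_MAX_JOB_ENTRIES_LOGGED=5000
KETTLE_MAX_JOB_TRACKER_SIZE=5000
KETTLE_MAX_LOGGING_REGISTRY_SIZE=5000
KETTLE_MAX_LOG_SIZE_IN_LINES=100000
KETTLE_MAX_LOG_TIMEOUT_IN_MINUTES=0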

Does anyone have any idea what the problem could be?

Thanks in advance.