December 1, 2015, 3:26 am
Hi
I'm trying Pentaho Analyzer (Enterprise evaluation).
I added calculated measures in my analysis report. After modifying my model (I added a field), all the calculated fields are missing. The message is: "Some fields in this report have been made unavailable by the administrator", etc.
What shall I do about that?
Thanks in advance for any reply.
Eric.
↧
December 1, 2015, 3:46 am
Hi,
I am new to Pentaho.
I have a requirement to check whether a particular date coming from a source file is valid or not, and if it is invalid, to put the default date 1900-01-01 instead.
E.g., if the dates below are in the source file:
1) 2014-01-01
2) 2015-25-25
3) 2013-01-34
Output
1) 2014-01-01
2) 1900-01-01 ---- replaced with default date
3) 1900-01-01 ---- replaced with default date
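To make the required logic concrete, here is the strict-parse-with-default check as a plain Java sketch; how it would be wired into a PDI step (e.g. a User Defined Java Class) is an assumption on my part, not part of the requirement:
Code:
// Strict date validation with a 1900-01-01 fallback - a standalone sketch.
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateDefaulter {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        fmt.setLenient(false); // strict mode rejects impossible dates like 2015-25-25
        Date fallback = fmt.parse("1900-01-01");
        for (String src : new String[] {"2014-01-01", "2015-25-25", "2013-01-34"}) {
            Date result;
            try {
                result = fmt.parse(src);
            } catch (ParseException e) {
                result = fallback; // invalid date -> default 1900-01-01
            }
            System.out.println(src + " -> " + fmt.format(result));
        }
    }
}
The key detail is setLenient(false); with the default lenient parsing, 2013-01-34 would silently roll over to 2013-02-03 instead of failing.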
Thanks in advance!!
-Santosh
↧
December 1, 2015, 3:58 am
Hi,
I must read a text file with fixed-length fields in it, and there are a bunch of decimal fields conforming to the following pattern:
- 1st char: plus or minus sign.
- 2nd - 4th char: integer part of the number
- 5th char: decimal point
- 6th - 8th char: decimal part of the number
Examples:
+100.000
-023.240
+000.000
+001.000
I cannot get the number recognized by the "Fixed Input" step.
Is there any solution without adding another step?
The only way I have found to read the number is to set the field type to String and then process it in another step after this one.
What would be the best solution for transforming the string field into the correct number field (taking into account that there are 30+ fields of this type in the record)?
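For what it's worth, once the field is read as a String the conversion itself is small, since BigDecimal accepts an explicit leading sign; a plain Java sketch using the sample values above:
Code:
// Converting "+100.000"-style tokens to numbers - a standalone sketch.
import java.math.BigDecimal;

public class SignedFixedDecimal {
    public static void main(String[] args) {
        for (String raw : new String[] {"+100.000", "-023.240", "+000.000", "+001.000"}) {
            BigDecimal value = new BigDecimal(raw); // the leading '+' or '-' is handled
            System.out.println(raw + " -> " + value);
        }
    }
}
Since PDI number masks follow Java's DecimalFormat, a format mask such as +000.000;-000.000 on the Fixed Input field itself might also parse these directly - that is a guess worth testing before adding a conversion step.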
↧
December 1, 2015, 4:55 am
Hi All,
I'm using Pentaho BI-Server ce-6.0.0.0-353.
In it, I want to implement a queuing mechanism for the execution of reports.
Reports should execute when resources are free on the database server.
Can anyone tell me whether this is possible in Pentaho, or do I need to use another tool for it?
Thanks in advance.
Regards,
Anand
↧
December 1, 2015, 4:57 am
Does PDI work with Oracle Advanced Security (OAS)? It would be used instead of a JNDI connection.
I've been searching for information about it, but I can't find anything.
Thanks.
↧
December 1, 2015, 5:08 am
Hi All,
I'm using Pentaho BI-Server CE 5.4 and the latest version, 6.0.0.0-353.
In them, I want to implement/customize the scheduling pop-up so that it offers an option to choose the delivery mode for reports: email or FTP.
It should also check that any mail over 5 MB is delivered via FTP only.
Is this customization possible in the latest BI-Server CE version or in 5.x.x? If it is, does it require the source code, and how would I go about it (using Eclipse)?
Waiting for a reply...
Thanks in Advance
Thanks,
Anand
↧
December 1, 2015, 5:33 am
Hi Everyone,
I am trying to connect to an MS SQL Server database via JDBC from a Pentaho package's transformation file. I tried several solutions available on the net, e.g. adding the sqljdbc_auth.dll file in several places, but it didn't work.
The error coming up is as follows:
Error connecting to database [database name] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database..
Login failed for 'particular user'..
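To narrow down whether the problem is in PDI or in the driver/credentials, it may help to try the same login from a bare JDBC program first; a minimal sketch (host, port, database, user, and password are placeholders):
Code:
// Standalone connectivity check, assuming sqljdbc4.jar is on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;

public class ConnTest {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://localhost:1433;databaseName=mydb"; // placeholder
        try (Connection c = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected to " + c.getMetaData().getDatabaseProductName());
        }
    }
}
Note that sqljdbc_auth.dll only matters for Windows integrated authentication; that mode needs integratedSecurity=true in the URL and the DLL on java.library.path.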
Can anyone help me resolve this? It would be a great help.
Regards,
Chaitanya
↧
December 1, 2015, 10:08 am
Hi,
I have been using Kettle-Spoon 5.3 with Java 7 Update 67. The company plans to remove all Java 7 installations this Friday and install Java 8. Can I still use Kettle-Spoon 5.3 with Java 8, or do I have to upgrade Kettle to version 6 to use it with Java 8?
Thank you.
↧
December 1, 2015, 10:49 am
I have a series of SQL files I would like to run, inserting the results into a MySQL database.
I have the SQL Script runner set up to run from a file, but I cannot figure out how to create a table using the results of the SQL script.
(The SQL script I'm running is a SELECT statement.)
I am on a Windows machine, so I cannot use the MySQL Bulk Loader.
Is it possible to use a Table Input (or Table Output) step to run SQL from a file (as is possible in the SQL Script tool)? If it is not, how would I go about doing this?
Please advise.
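If the script file holds a single SELECT (no trailing semicolon), one workaround is to wrap it in MySQL's CREATE TABLE ... AS SELECT; a bare-JDBC sketch of the idea (file name, connection details, and table name are placeholders):
Code:
// Materialize a SELECT from a .sql file into a new MySQL table - a sketch.
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateTableFromScript {
    public static void main(String[] args) throws Exception {
        // assumes query.sql contains one plain SELECT statement
        String select = new String(Files.readAllBytes(Paths.get("query.sql")), "UTF-8");
        String url = "jdbc:mysql://localhost:3306/mydb"; // placeholder
        try (Connection c = DriverManager.getConnection(url, "user", "password");
             Statement st = c.createStatement()) {
            st.executeUpdate("CREATE TABLE result_copy AS " + select);
        }
    }
}
Alternatively, since Table Input supports variable substitution, reading the file contents into a variable in a previous transformation and referencing it in the Table Input SQL might work too.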
↧
December 1, 2015, 2:07 pm
I am getting the following error trying to execute a job that invokes a Pentaho MapReduce step; it seems that the property mapreduce.job.name has not been set by the step and must be included manually in a configuration file.
2015/12/01 16:39:01 - Pentaho MapReduce - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : The value of property mapreduce.job.name must not be null
2015/12/01 16:39:01 - Pentaho MapReduce - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : java.lang.IllegalArgumentException: The value of property mapreduce.job.name must not be null
2015/12/01 16:39:01 - Pentaho MapReduce - at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.conf.Configuration.set(Configuration.java:1048)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.conf.Configuration.set(Configuration.java:1029)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.mapred.JobConf.setJobName(JobConf.java:1443)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.apache.hadoop.mapreduce.Job.setJobName(Job.java:1023)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.hadoop.shim.cdh54.ConfigurationProxyV2.setJobName(ConfigurationProxyV2.java:61)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:525)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.execute(Job.java:730)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.execute(Job.java:873)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.execute(Job.java:546)
2015/12/01 16:39:01 - Pentaho MapReduce - at org.pentaho.di.job.Job.run(Job.java:435)
2015/12/01 16:39:01 - weblog_parse_mr - Finished job entry [Pentaho MapReduce] (result=[false])
↧
December 1, 2015, 3:00 pm
I've recently upgraded to 6.0.0 running under Java 8 on a MacBook Pro, and Spoon crashes every time I click the Edit or New button to alter connection details. If I simply start Spoon, create a new transformation file, add a Table Input step, then open it and click one of the aforementioned buttons, it crashes with the attached error.
Any ideas? Perhaps I need to provide more details about my environment? Thanks in advance.
↧
December 1, 2015, 3:22 pm
Hello.
Can anyone send me a tutorial on how to configure CE 6 on Ubuntu 14.04 to connect to a repository on PostgreSQL or MySQL? I have searched, but all the tutorials are for earlier versions, and after changing the files, CE 6 doesn't start.
↧
December 1, 2015, 11:39 pm
I am trying to write a BLOB to a file. The BLOB field stores image data. I tried blobsinkettle.zip, but the generated file is 0 KB.
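For comparison, here is what the step should end up doing, as a bare-JDBC sketch (driver, table, column, and file names are placeholders):
Code:
// Dump one BLOB column to a file - a standalone sketch.
import java.io.FileOutputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BlobToFile {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/mydb"; // placeholder
        try (Connection c = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = c.prepareStatement(
                     "SELECT image_data FROM images WHERE id = ?")) {
            ps.setInt(1, 1);
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    try (InputStream in = rs.getBinaryStream(1);
                         FileOutputStream out = new FileOutputStream("image.bin")) {
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = in.read(buf)) > 0) {
                            out.write(buf, 0, n);
                        }
                    }
                }
            }
        }
    }
}
A 0 KB output means the bytes never reached the write, so it is worth previewing the BLOB field's length before the output step.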
↧
December 2, 2015, 1:05 am
Hi All,
What is the best way to install R on a PDI Linux server?
Thanks.
↧
December 2, 2015, 1:14 am
Hi colleagues,
I have an issue running a PDI job in batch, though it runs perfectly from Spoon.
The error is:
Error connecting to database: (using class com.microsoft.sqlserver.jdbc.SQLServerDriver)
The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "RSA premaster secret error".
Obviously, I've added sqljdbc4.jar to the libext\JDBC folder.
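Since the job works from Spoon but not in batch, one thing worth ruling out is the two runs using different JVMs or crypto settings - a mismatch like that is a commonly suggested cause of the "RSA premaster secret error". A trivial probe to run from both contexts:
Code:
// Print which JVM is actually executing - run via Spoon's environment
// and via the batch environment, then compare the output.
public class JvmInfo {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
    }
}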
Please advise.
Thanks,
Sveta
↧
December 2, 2015, 2:45 am
I need to replicate this query:
Code:
SELECT *
FROM a
right outer join
b
ON b.field1 LIKE CONCAT(a.field2, '%')
I don't know how Kettle can replicate LIKE and CONCAT in a join.
Any idea?
Thanks!
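For reference, the condition b.field1 LIKE CONCAT(a.field2, '%') is just a prefix match: keep a pairing when b.field1 starts with a.field2, and (because of the right outer join) keep unmatched b rows with nulls. A plain-Java sketch of those semantics, with invented values - in Kettle terms this maps naturally onto a cross join (e.g. Join Rows) followed by a filter, though that is a suggestion, not a tested recipe:
Code:
// Semantics of: a RIGHT OUTER JOIN b ON b.field1 LIKE CONCAT(a.field2, '%')
import java.util.Arrays;
import java.util.List;

public class PrefixJoin {
    public static void main(String[] args) {
        List<String> a = Arrays.asList("abc", "xy");            // a.field2 values
        List<String> b = Arrays.asList("abcdef", "xyz", "qqq"); // b.field1 values
        for (String bf : b) {
            boolean matched = false;
            for (String af : a) {
                if (bf.startsWith(af)) { // the LIKE 'prefix%' condition
                    System.out.println(af + " | " + bf);
                    matched = true;
                }
            }
            if (!matched) {
                System.out.println("null | " + bf); // right outer: keep unmatched b rows
            }
        }
    }
}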
↧
December 2, 2015, 3:04 am
Hi everyone
I need to get numbers from a database (Oracle) with the decimal separator set to ',' and the grouping separator set to a space.
I've added these parameters in spoon.bat: %PENTAHO_DI_JAVA_OPTIONS% "-Duser.language=pl" "-Duser.region=PL"
Now, when I use the Generic Database connection type with a custom URL, it works fine and the query returns commas in the numbers, but when I use the Oracle connection type, the query still returns dots.
I've compared the "Feature List" of these two connections and didn't notice any differences - both of them create the same URL and driver class (jdbc:oracle:thin:@<IP_ADDRESS>:1521:<SID>, oracle.jdbc.driver.OracleDriver); the other parameters also look the same.
Could you tell me what is going on?
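For illustration, the user.language/user.region switches work because Java's number formatting follows the JVM default locale; a small sketch of the difference (values invented):
Code:
// Decimal and grouping separators depend on the locale.
import java.text.NumberFormat;
import java.util.Locale;

public class LocaleFormatting {
    public static void main(String[] args) {
        double v = 1234.56;
        // Polish: comma decimal separator, space grouping -> "1 234,56"
        System.out.println(NumberFormat.getInstance(new Locale("pl", "PL")).format(v));
        // US: dot decimal separator, comma grouping -> "1,234.56"
        System.out.println(NumberFormat.getInstance(Locale.US).format(v));
    }
}
If the Oracle connection type bypasses this while the Generic one honours it, the Oracle-specific code path presumably formats numbers independently of the default locale - but that is only a guess from the symptom.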
↧
December 2, 2015, 3:43 am
Hi, I'm using PDI 5.2. While consolidating three .CSV files, I'm facing a problem: there is a common column in all three files, but its value is different in each, and when I consolidate them, the output shows the same value throughout. In the file attached below, in the output console, the column "Exp_Name" contains the same value throughout, but in the source it has different values.
The components I used are Text File Input ---> Add Constant ---> Calculator.
Please help me with this.
↧
December 2, 2015, 3:44 am
Hello!
I am transforming XML to JSON in Kettle. I have an XML file with a parent-child hierarchy, and I want the same hierarchy to be maintained in the JSON output. I am using Get Data From XML, and only the elements are getting populated in the field window. For example, my structure is:
<department>
<name>ADMINISTRATION</name>
<id>1</id>
<employee>
<EmpID>7956</EmpID>
</employee>
</department>
Here, name, id, and EmpID are populated in the JSON. The hierarchy department-employee isn't maintained in the JSON output.
How can I have the parent-child structure maintained in the output?
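To make the target concrete: the desired output nests employee under department instead of flattening everything. Here is a plain-Java sketch (JDK DOM only, all values emitted as JSON strings for brevity) that preserves the nesting - it shows the transformation I am after, not how the Kettle steps do it:
Code:
// Recursively mirror XML element nesting as JSON objects - a sketch.
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class XmlToNestedJson {
    static String toJson(Element e) {
        NodeList kids = e.getChildNodes();
        boolean hasElementChild = false;
        for (int i = 0; i < kids.getLength(); i++) {
            if (kids.item(i) instanceof Element) hasElementChild = true;
        }
        if (!hasElementChild) {
            return "\"" + e.getTextContent().trim() + "\""; // leaf -> string value
        }
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (int i = 0; i < kids.getLength(); i++) {
            if (!(kids.item(i) instanceof Element)) continue;
            Element child = (Element) kids.item(i);
            if (!first) sb.append(",");
            sb.append("\"").append(child.getTagName()).append("\":").append(toJson(child));
            first = false;
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<department><name>ADMINISTRATION</name><id>1</id>"
                   + "<employee><EmpID>7956</EmpID></employee></department>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new java.io.ByteArrayInputStream(xml.getBytes("UTF-8")));
        Element root = doc.getDocumentElement();
        // {"department":{"name":"ADMINISTRATION","id":"1","employee":{"EmpID":"7956"}}}
        System.out.println("{\"" + root.getTagName() + "\":" + toJson(root) + "}");
    }
}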
↧
December 2, 2015, 7:42 am
I have a set of jobs and transformations and use file-based logging during execution.
After upgrading to PDI 6.0, it looks like the logging stops writing to the files and also to the Spoon log tab. No error messages are shown.
I tried to increase the values related to logging in PDI, but it does not make any difference:
KETTLE_MAX_JOB_ENTRIES_LOGGED = 5000 (the maximum number of job entry results kept in memory for logging purposes)
KETTLE_MAX_JOB_TRACKER_SIZE = 5000 (the maximum number of job trackers kept in memory)
KETTLE_MAX_LOGGING_REGISTRY_SIZE = 5000 (the maximum number of logging registry entries kept in memory for logging purposes)
KETTLE_MAX_LOG_SIZE_IN_LINES = 100000 (the maximum number of log lines kept internally by Kettle; set to 0 to keep all rows - the default)
KETTLE_MAX_LOG_TIMEOUT_IN_MINUTES = 0 (the maximum age, in minutes, of a log line kept internally by Kettle; set to 0 to keep all rows indefinitely - the default)
Does anyone have any idea what the problem could be?
Thanks in advance.
↧