Pentaho Community Forums

Modify existing job to loop through multiple accounts for repeating process

Hi,

We currently have a Kettle/Spoon job that, for one department, accesses a web service to retrieve data and creates a file on our side. Other departments now need to use it, so we will have multiple accounts calling the web service (each retrieving its own data so the process repeats and creates a file per department). So far my searching has turned up some looping examples, but I don't think they are close to what we need.

Our new steps will need to do the following:

- Store the department names (I just started reading about storing parameters in Kettle).
- At the beginning of the process, assign the first name to a variable.
- Use the variable in the existing logic for naming the file, selecting the right password, etc. (this part should be straightforward, but I wanted to describe what we are doing).
- When the process finishes for the first department, return to the beginning, get the next name, and repeat (see the sketch below).
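
One common pattern, sketched here as a shell loop under the assumption that the job is modified to accept a hypothetical named parameter DEPT_NAME: store the department names outside the job and invoke it once per name with kitchen.sh. The department names below are placeholders.

Code:

#!/bin/bash
# Run the existing job once per department, passing the name as a named
# parameter. DEPT_NAME is a hypothetical parameter the job would define and
# use for file naming, password selection, etc.
for dept in finance hr sales; do
    ./kitchen.sh -file=/path/to/existing_job.kjb -param:DEPT_NAME="$dept" -level=Basic
done

Inside PDI itself, the equivalent pattern is a transformation that reads the department names into a "Copy rows to result" step, followed by a Job entry set to "Execute for every input row", which re-runs the worker job once per department.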

Thanks.

Schema Workbench Freezes after Connection Options.

$
0
0
Hi, I am new to the Pentaho suite. I'm trying to create a connection in Schema Workbench (Product Version: 3.11.1.0-386):

-> Options -> Connections

I select General, Connection Type: AS/400, Access: Native (JDBC).

Connection Name: Test
With these settings:
Host Name: 10.2X.YY.ZZ (I give a correct host name)
Database Name: BDTEST (I give a correct database name)
Port Number: -1
User Name: SISEDO
Password: xxxx (I give a correct password)

When I click Test, everything is OK.

When I click OK and go back to Schema Workbench, it freezes...
Clicking any menu option does nothing!

I put the driver in the correct folder.

Thanks in advance for your help.

J.

Transformation Executor help

$
0
0
I am attempting to use a Transformation Executor step to do something I've never tried before (and I am open to alternatives if there's a better way). This is also my first time using the Transformation Executor step, so I am not sure I "get it".

I have a spreadsheet that I am trying to modify to create a new spreadsheet. In each row of the original there is a ^-delimited string of record IDs that I want to transform into a series of HTML links that use the IDs.

I believe I can do this with a Transformation Executor: take the delimited string and pass it to a child transformation; in the child, split the delimited string into rows (one ID per row), concatenate an HTML link from each ID, then join the rows back into a single field and pass it back to the parent transformation. The parent then continues with this new field, along with the others from the original sheet, and writes a new sheet.
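
The per-row logic is simple enough to sketch outside PDI. A bash illustration of what the child transformation has to do, assuming a hypothetical https://example.com/records/<ID> link pattern:

Code:

#!/bin/bash
# Split a ^-delimited ID string, wrap each ID in an HTML link, then join the
# links back into a single field -- the split / concatenate / regroup sequence
# the child transformation performs.
ids="101^205^317"                      # example input field
links=""
IFS='^' read -r -a parts <<< "$ids"
for id in "${parts[@]}"; do
    links+="<a href=\"https://example.com/records/${id}\">${id}</a> "
done
echo "${links% }"                      # the rebuilt single field

In PDI terms that would be a Split field to rows step, a step that builds the link string, and a Group by (or Memory group by) step that concatenates the rows back into one field.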

I've made a first attempt at this, but the new sheet is written with just column headers and no data. So apparently neither the original stream nor the results of the child transformation are being passed along.

I don't know how to even begin debugging this, given that the data is getting lost inside a child transformation.

Help?

expertise data integration

When data is rapidly increasing and arriving in assorted forms, there is a growing need for a flexible, adaptable, resourceful, and cost-effective platform that takes minimal on-boarding time. Pentaho fits perfectly in this space, with an established track record.

This online Pentaho course trains you in real-time data integration and quick, accurate data analytics.
https://goo.gl/oQFgwc

skillset required for ETL Tool

ETL is an evergreen area. Most of the major banks in the world use an ETL tool, and other big companies that collect data from different geographical locations around the world cannot survive without one. https://goo.gl/7WYtsY
Though job openings are fewer in the market, the availability of qualified candidates is also low, so there is not much competition, and it is one of the best domains in which to build a career.

Windows Task Scheduler and Pentaho job: failed

Hi,
I created a Pentaho job and a .bat file that executes successfully when I run it manually, but when I schedule it with Windows Task Scheduler, the log file reports Result = false.
Can you help, please? Here is the content of my .bat file:
D:\Pentaho\data-integration\kitchen.bat /file:D:\kettle\test.kjb /level:Basic > D:\temp\trans.log
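
Two things commonly differ under Task Scheduler: the working directory and the account's environment. A hedged revision of the same .bat that pins the working directory, captures stderr, and records Kitchen's exit code:

Code:

cd /d D:\Pentaho\data-integration
call kitchen.bat /file:D:\kettle\test.kjb /level:Basic > D:\temp\trans.log 2>&1
echo Kitchen exit code: %ERRORLEVEL% >> D:\temp\trans.log

Also check that the task's "Start in" directory is set and that the account it runs under can read D:\kettle and write to D:\temp.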

Microsoft business intelligence

Microsoft Business Intelligence allows storing and retrieving data for smart processing and rapid decisions in business.

This Business Intelligence suite from Microsoft consists of resourceful tools providing solutions for Business Intelligence.

The demand for SSAS, SSIS, and SSRS professionals has been rising lately, and many reputed companies are looking for skilled business intelligence candidates.
https://goo.gl/hRlWUh

Warehouse Modeling

This course covers data warehousing and data modeling. It includes the following topics: introduction to data modeling, the multidimensional model, an introduction to RDBMS and Database vs. Data Warehouse, dimensional modeling, cubes, and ERwin. The ERwin training further covers the need for a data model, the multidimensional model, SCD implementation models, RDBMS, SQL parsing, and other such topics. The course is designed for beginners through advanced-level professionals.
https://goo.gl/QhJIJA

Pig script executor not working

Hi all,
I am trying to run the official 'Pig script executor' example from wiki.pentaho.com, but it does not seem to work for me: the job is not submitted to the Hadoop cluster.

Here is the PDI log message I get:

Code:

2016/04/29 12:12:28 - Pentaho Data Integration - Starting job ...
2016/04/29 12:12:43 - Pig_script_executor - Starting job
2016/04/29 12:12:43 - Pig_script_executor - Starting execution of entry [Pig Script Executor]
2016/04/29 12:12:43 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/04/29 12:12:43 - Pig_script_executor - Finished execution of job entry [Pig Script Executor] (result=[true])
2016/04/29 12:12:43 - Pig_script_executor - Finished job execution
2016/04/29 12:12:43 - Pentaho Data Integration - Job execution has completed.
2016/04/29 12:12:43 - Pig Script Executor - Pig Script Executor in Pig_script_executor has been started asynchronously. Pig_script_executor has been finished and logs from Pig Script Executor can be lost
2016/04/29 12:12:44 - Pig Script Executor - 2016/04/29 12:12:44 - Connecting to hadoop file system at: hdfs://sigma-server:54310
2016/04/29 12:12:45 - Pig Script Executor - 2016/04/29 12:12:45 - Connecting to map-reduce job tracker at: sigma-server:8032
2016/04/29 12:12:45 - Pig Script Executor - 2016/04/29 12:12:45 - Empty string specified for jar path
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - Pig features used in the script: GROUP_BY
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, DuplicateForEachColumnRewrite, FilterLogicExpressionSimplifier, GroupByConstParallelSetter, ImplicitSplitInserter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, NewPartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[PartitionFilterOptimizer]}
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - File concatenation threshold: 100 optimistic? false
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - Choosing to move algebraic foreach to combiner
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - MR plan size before optimization: 1
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - MR plan size after optimization: 1
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - Pig script settings are added to the job
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - Reduce phase detected, estimating # of required reducers.
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - Using reducer estimator: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - BytesPerReducer=1000000000 maxReducers=999 totalInputFileSize=81468050
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - Setting Parallelism to 1
2016/04/29 12:12:46 - Pig Script Executor - 2016/04/29 12:12:46 - creating jar file Job3343392863197581306.jar
2016/04/29 12:12:49 - Pig Script Executor - 2016/04/29 12:12:49 - jar file Job3343392863197581306.jar created
2016/04/29 12:12:49 - Pig Script Executor - 2016/04/29 12:12:49 - Setting up single store job
2016/04/29 12:12:49 - Pig Script Executor - 2016/04/29 12:12:49 - Key [pig.schematuple] is false, will not generate code.
2016/04/29 12:12:49 - Pig Script Executor - 2016/04/29 12:12:49 - Starting process to move generated code to distributed cache
2016/04/29 12:12:49 - Pig Script Executor - 2016/04/29 12:12:49 - Setting key [pig.schematuple.classes] with classes to deserialize []
2016/04/29 12:12:49 - Pig Script Executor - 2016/04/29 12:12:49 - 1 map-reduce job(s) waiting for submission.


Can anyone help me with this?

Thanks in advance

cannot connect to Amazon Redshift with Pentaho CE

Hi

I have tried with Pentaho CE 6.1 and 5.3

I have tried with both drivers found here:
http://docs.aws.amazon.com/redshift/...ad-jdbc-driver

I copied these to the lib folder, and Redshift pops up as one of the connection types that can be selected.

However, after I configure and test the connection:
1) The JDBC 4.0-compatible driver gives me the error "[Amazon](100021) Error setting default driver property values."
2) The JDBC 4.1-compatible driver gives me the error "Driver class 'com.amazon.redshift.jdbc4.Driver' could not be found, make sure the 'Redshift' driver (jar file) is installed.com.amazon.redshift.jdbc4.Driver"
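
Note that error 2) shows PDI loading the 4.0 class name (com.amazon.redshift.jdbc4.Driver) while the JDBC 4.1-compatible jar ships com.amazon.redshift.jdbc41.Driver, so a jar/class mismatch alone could explain that message. A quick hedged check (the jar file name below is illustrative):

Code:

# List the driver classes actually present in the jar you copied into lib/
jar tf RedshiftJDBC41-1.1.10.1010.jar | grep 'Driver.class'
# 4.0 jars contain com/amazon/redshift/jdbc4/Driver.class
# 4.1 jars contain com/amazon/redshift/jdbc41/Driver.class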

I have read that people have successfully connected to Redshift - can somebody give the exact procedure for how they configured it?


Best regards
Kristjan

Problem with charts in Pentaho Analyzer report

Hi every one,
I'm using Pentaho Enterprise Edition latest version(6.0) for creating some analyzer reports.I need to show the data into the chart and data labels should come with "LABEL NAME" instead of value like the image attached.I tried many ways but i could able to get,even i tried with MDX Expressions how ever that is also not meeting the expectations, and how i can get individual legends for each chart when i use multi-pie option in the chart option

Thanks in Advance

Updating single table in parallel issue

I have a transformation that does the following:

Read a distinct set of data from Table A
Obtain a lookup value for each record from Table B
Update Table A with the record from Table B

I'm running the update step with 10 copies to speed things up. Occasionally I get a deadlock error on the update. The result set should be distinct, so the updates should not be hitting the same record, and it is not a DB lock held by another process.

The initial select also uses the following clause to partition the rows across the parallel copies:
WHERE MOD(UNIQUE_ID,${Internal.Step.Unique.Count}) = ${Internal.Step.Unique.Number}

Is a parallel update possible, or is Kettle doing something behind the scenes that I'm not aware of?

ETA: it does not occur on every execution; some runs have no issue, others deadlock.

TIA

Clearly defined database interface - looking for advice

Hi,

I am relatively new to PDI/ETL. I am also new to a team of developers who use a (long-existing) .NET application in combination with PDI. A simplified version of the process looks like this:

Via a web interface, a PDI job is triggered. This job launches the .NET application, which performs certain operations and writes the results into a database shared among all components of the system. Afterwards, the ETL process picks up the exported data and transforms it further, and so on.

Within this setup we have problems due to a poorly defined interface between the .NET application and the ETL process: whenever a developer changes a database table, he or she cannot easily foresee whether this will break the ETL process down the road. Due to the complexity of the system, such a mistake might only become visible hours later, which is a huge problem for the team's workflow.

I would be happy if someone could give me a hint about a good-practice solution to this kind of challenge. Is it possible, for example, to define a data model beforehand and access DB fields only through that model instead of accessing tables and columns directly? I am probably explaining this poorly; I am thinking of something equivalent to the practice of not writing a hard-coded string in 100 different places in an application's source code, but encapsulating the information instead.

Script execution in Kettle job fails

After upgrading from PDI 6.0 to 6.1, I am having an issue running PowerShell scripts from the "Script" module in a Kettle job.

No matter how I start the job (directly via "Insert script", by calling PowerShell.exe with arguments, or from a batch file), it always gives me the same error. I've also tried removing the bulk of the script to isolate the failure, but the "New-WebServiceProxy" command is at the beginning of the (now very short) script.

The error seems to be common to Java programs and appears to be related to the environment variables present at launch time and a hard limit in Java on the size of the Windows process environment block.

Code:

2016/04/29 12:53:43 - Run Powershell - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : (stderr) New-WebServiceProxy : The environment block used to start a process cannot be longer than 65535 bytes.  Your
2016/04/29 12:53:43 - Run Powershell - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : (stderr) environment block is 75416 bytes long.  Remove some environment variables and try again.

I have tried this on several systems (Windows 7, Windows 10, and Windows Server 2012 R2), all with the same results. Both my modified and unmodified scripts run as expected on all systems under PDI 6.0.

Any ideas on how to fix or circumvent this?
Thanks!
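
One possible workaround, sketched on the assumption that a few oversized variables are pushing the environment block past the 65,535-byte limit: clear the largest ones in a wrapper before PDI spawns PowerShell. The variable names below are placeholders; in cmd, "set NAME=" with no value removes the variable for that process only.

Code:

rem Hypothetical wrapper: shrink the environment before launching the job
set SOME_LARGE_VARIABLE=
set ANOTHER_LARGE_VARIABLE=
call D:\Pentaho\data-integration\kitchen.bat /file:D:\kettle\job_with_powershell.kjb /level:Basic

Running "set > env.txt" first and checking the file's size is a quick way to see how close the environment already is to the limit.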

BTable refresh at runtime

Hello, I have a BTable component to which I pass parameters (dimensions, filters, and metrics) in the pre-execution function.
These parameters change according to the user's choices at runtime.
What I cannot achieve is getting the component to refresh with the new parameters.
Would someone be so kind as to give me a clue?


Thank you

default user account

How can we remove the default user account from the authentication page (Administration Console)?
Thanks

Parallel transformations as part of a PDI job fail

Sorry, I am new to the forum and to PDI development. I have an issue running parallel transformations in a job.
I created a PDI job (.kjb) with 5 transformations, running in parallel, that insert data into target table B from source table A. For small data volumes the job is fine;
however, when I need to insert millions of rows, I run into out-of-memory issues.

But when I call the same transformations through a shell script, as below, I can insert up to 300 million rows in parallel with no issues at all.

#main#

./pan.sh -file=../jobs/instance1.ktr -logfile=./instance1.log &
./pan.sh -file=../jobs/instance2.ktr -logfile=./instance2.log &
./pan.sh -file=../jobs/instance3.ktr -logfile=./instance3.log &
./pan.sh -file=../jobs/instance4.ktr -logfile=./instance4.log &
./pan.sh -file=../jobs/instance5.ktr -logfile=./instance5.log &

I am not sure why the PDI job that runs the transformations in parallel runs out of memory and fails, while the same transformations run fine through the .sh script.
I really want to run this through a PDI job, not a .sh script. Please let me know where I am going wrong. If memory were the issue, shouldn't both the PDI job and the .sh script fail?
I am puzzled; sorry, I am new to PDI. Please suggest what the reason could be and what I should do to make this work through PDI.
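
One likely difference, offered as a hypothesis: the shell script starts five separate pan.sh processes, each with its own JVM heap, while a single Kitchen run executes all five transformations inside one JVM, so their combined working set must fit in one heap. A hedged sketch of raising that heap before running the job (the job file name is illustrative):

Code:

#!/bin/bash
# Give the single Kitchen JVM more headroom than the default before running
# the job that executes the five transformations in parallel.
export PENTAHO_DI_JAVA_OPTIONS="-Xms1g -Xmx6g"
./kitchen.sh -file=../jobs/parallel_inserts.kjb -logfile=./parallel_inserts.log

Reducing the commit/batch size on the table output steps can also lower each transformation's peak memory.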

HL7 input in Pentaho Kettle - repeating segments in HL7 message

I want to know how to deal with this type of repeated HL7 segment in Pentaho Kettle; I want to extract each piece of information from each segment. It's urgent - thank you.

DG1|1||J44.1^CHRONIC OBSTRUCTIVE PULMONARY DISEASE WITH (ACUTE) EXACERBATION||20160224|W|||||||||||||||U
DG1|2||J90^PLEURAL EFFUSION, NOT ELSEWHERE CLASSIFIED||20160224|W|||||||||||||||U
DG1|3||I10^ESSENTIAL (PRIMARY) HYPERTENSION||20160224|W|||||||||||||||U
DG1|4||I50.30^UNSPECIFIED DIASTOLIC (CONGESTIVE) HEART FAILURE||20160224|W|||||||||||||||U
DG1|5||E11.9^TYPE 2 DIABETES MELLITUS WITHOUT COMPLICATIONS||20160224|W|||||||||||||||U
DG1|6||I73.9^PERIPHERAL VASCULAR DISEASE, UNSPECIFIED||20160224|W|||||||||||||||U
DG1|7||M15.9^POLYOSTEOARTHRITIS, UNSPECIFIED||20160224|W|||||||||||||||U
DG1|8||E78.5^HYPERLIPIDEMIA, UNSPECIFIED||20160224|W|||||||||||||||U
DG1|9||G47.33^OBSTRUCTIVE SLEEP APNEA (ADULT) (PEDIATRIC)||20160224|W|||||||||||||||U
DG1|10||I27.2^OTHER SECONDARY PULMONARY HYPERTENSION||20160224|W|||||||||||||||U
DG1|11||M62.81^MUSCLE WEAKNESS (GENERALIZED)||20160224|W|||||||||||||||U
DG1|12||R26.2^DIFFICULTY IN WALKING, NOT ELSEWHERE CLASSIFIED||20160224|W|||||||||||||||U
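
For orientation, the per-segment fields are easy to pick out with a sketch outside Kettle, given the pipe-delimited layout above (set ID in field 2, code^description pair in field 4; the file name is illustrative):

Code:

#!/bin/bash
# Print the set ID, diagnosis code and description from each repeated DG1
# segment of a pipe-delimited HL7 message.
awk -F'|' '$1 == "DG1" { split($4, cd, "^"); print $2, cd[1], cd[2] }' message.hl7

Inside PDI, the HL7 Input step flattens each message into one row per field value, with the segment name and coordinate as columns, so repeated DG1 segments arrive as separate groups of rows that can be filtered by segment name and regrouped (for example with a Row denormaliser).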

Unable to Create Repository

Hi there, I'm running Kettle 6.1.0.1 on Mac OS X.

I've set up a generic database connection to PostgreSQL, and it connects OK. When I select Create / Upgrade for the repository connection, I get the following error. (I did figure out that the admin password is admin.)
Any advice is most welcome.
Cheers,
Kirk

java.lang.reflect.InvocationTargetException: Error creating or upgrading repository:


Unable to insert new version log record into R_VERSION


Couldn't execute SQL: INSERT INTO R_VERSION VALUES(?, ?, ?, ?, ?)


ERROR: column "upgrade_date" is of type timestamp without time zone but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 46




at org.pentaho.di.ui.repository.dialog.UpgradeRepositoryProgressDialog$1.run(UpgradeRepositoryProgressDialog.java:106)
at org.eclipse.jface.operation.ModalContext.runInCurrentThread(ModalContext.java:369)
at org.eclipse.jface.operation.ModalContext.run(ModalContext.java:313)
at org.eclipse.jface.dialogs.ProgressMonitorDialog.run(ProgressMonitorDialog.java:495)
at org.pentaho.di.ui.repository.dialog.UpgradeRepositoryProgressDialog.open(UpgradeRepositoryProgressDialog.java:114)
at org.pentaho.di.ui.repository.kdr.KettleDatabaseRepositoryDialog.create(KettleDatabaseRepositoryDialog.java:533)
at org.pentaho.di.ui.repository.kdr.KettleDatabaseRepositoryDialog.access$500(KettleDatabaseRepositoryDialog.java:75)
at org.pentaho.di.ui.repository.kdr.KettleDatabaseRepositoryDialog$5.handleEvent(KettleDatabaseRepositoryDialog.java:287)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.notifyListeners(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.repository.kdr.KettleDatabaseRepositoryDialog.open(KettleDatabaseRepositoryDialog.java:324)
at org.pentaho.di.ui.repository.kdr.KettleDatabaseRepositoryDialog.open(KettleDatabaseRepositoryDialog.java:75)
at org.pentaho.di.ui.repository.RepositoriesHelper.editRepository(RepositoriesHelper.java:155)
at org.pentaho.di.ui.repository.controllers.RepositoriesController.editRepository(RepositoriesController.java:284)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$300(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$2.mouseUp(SwtButton.java:103)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.notifyListeners(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:88)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:55)
at org.pentaho.di.ui.repository.controllers.RepositoriesController.show(RepositoriesController.java:196)
at org.pentaho.di.ui.repository.RepositoriesDialog.show(RepositoriesDialog.java:90)
at org.pentaho.di.ui.spoon.Spoon.openRepository(Spoon.java:3759)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.notifyListeners(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1347)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7989)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9269)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:662)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleException:
Unable to insert new version log record into R_VERSION


Couldn't execute SQL: INSERT INTO R_VERSION VALUES(?, ?, ?, ?, ?)


ERROR: column "upgrade_date" is of type timestamp without time zone but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 46




at org.pentaho.di.repository.kdr.KettleDatabaseRepositoryCreationHelper.createRepositorySchema(KettleDatabaseRepositoryCreationHelper.java:236)
at org.pentaho.di.repository.kdr.KettleDatabaseRepository.createRepositorySchema(KettleDatabaseRepository.java:1640)
at org.pentaho.di.ui.repository.dialog.UpgradeRepositoryProgressDialog$1.run(UpgradeRepositoryProgressDialog.java:99)
... 74 more
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn't execute SQL: INSERT INTO R_VERSION VALUES(?, ?, ?, ?, ?)


ERROR: column "upgrade_date" is of type timestamp without time zone but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 46


at org.pentaho.di.core.database.Database.execStatement(Database.java:1513)
at org.pentaho.di.repository.kdr.KettleDatabaseRepositoryCreationHelper.createRepositorySchema(KettleDatabaseRepositoryCreationHelper.java:231)
... 76 more
Caused by: org.postgresql.util.PSQLException: ERROR: column "upgrade_date" is of type timestamp without time zone but expression is of type boolean
Hint: You will need to rewrite or cast the expression.
Position: 46
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2198)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1927)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:561)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:419)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:412)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1480)
... 77 more

Kettle: write to log when an SQL query fails

Hello everybody, I have a problem and hope someone can help me:

I'm creating a dynamic query that reads from an .xls file, but when I execute the (select) query, some rows of the .xls contain cells that make my SELECT fail.

How can I write to a log, or capture the error, so that another process can display or save these errors?

Thanks; I need this because it is a task in my role and I can't make it work.