Channel: Pentaho Community Forums

Problem with the daylight saving time change

Hello everyone: I am new to the forum. After a lot of searching on the web I found this forum, and I hope you can help me solve a problem that currently has me stuck at work.

When loading data from an Excel file with the format dd/mm/yyyy hh:mm:ss, Pentaho alters the data because of the summer/winter time change.
In this case it adds one hour, but only when Pentaho encounters data that falls in the affected time range on the day of the change.
I would like to know whether this problem can be solved through the program's own configuration (time zone).
How do I make it adapt to the time change so the same thing doesn't happen again in six months?

Many thanks in advance. Best regards.
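
Would something like the following help? It is only a sketch I have seen suggested elsewhere, not something I have verified: pinning the JVM time zone so Kettle does not apply the local DST rules when parsing the dates. The value UTC is just an example, and the exact startup script depends on whether you launch Spoon, Kitchen or Pan:

Code:

REM In Spoon.bat / Kitchen.bat (or the equivalent .sh script), append a fixed time zone to the JVM options
set OPT=%OPT% "-Duser.timezone=UTC"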

Question about embedding Pentaho in a Java app

Hi,
I have a custom .prpt (let's call it custom.prpt) and it works fine; however, my team is looking for a way to run the report automatically.
I am new to Pentaho and am looking for a way to achieve this.
I did some research and was able to get "org.pentaho.reporting.engine.classic.samples.Sample1.java" to work.
However, when I replaced "Sample1.prpt" with "custom.prpt" in "Sample1.java" (see the method below), I got errors left and right, even though "custom.prpt" is in the same directory as "Sample1.prpt". Am I even on the right track? I am sure there is more work to be done to make this work, but I am not sure how to solve this problem.

public MasterReport getReportDefinition()
{
  try
  {
    // Using the classloader, get the URL to the reportDefinition file
    final ClassLoader classloader = this.getClass().getClassLoader();
    final URL reportDefinitionURL = classloader.getResource("org/pentaho/reporting/engine/classic/samples/Sample1.prpt");

    // Parse the report file
    final ResourceManager resourceManager = new ResourceManager();
    final Resource directly = resourceManager.createDirectly(reportDefinitionURL, MasterReport.class);
    return (MasterReport) directly.getResource();
  }
  catch (ResourceException e)
  {
    e.printStackTrace();
  }
  return null;
}
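
For comparison, here is a minimal sketch that loads the report from an explicit file path instead of the classpath. The path /path/to/custom.prpt is hypothetical, and it assumes the same imports as Sample1.java plus java.io.File and ClassicEngineBoot:

public MasterReport getCustomReportDefinition()
{
  try
  {
    // Boot the reporting engine once before parsing any report
    ClassicEngineBoot.getInstance().start();

    // Load custom.prpt straight from the file system (hypothetical path)
    final File reportFile = new File("/path/to/custom.prpt");

    // Parse the report file
    final ResourceManager resourceManager = new ResourceManager();
    resourceManager.registerDefaults();
    final Resource resource = resourceManager.createDirectly(reportFile, MasterReport.class);
    return (MasterReport) resource.getResource();
  }
  catch (ResourceException e)
  {
    e.printStackTrace();
  }
  return null;
}

Is that roughly the right direction, or should custom.prpt stay on the classpath?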

please advise,

Neil

Pentaho Deployment Conflicting with Group Policy Client

Greetings,

We have a custom deployment of Pentaho using PostgreSQL and Apache Tomcat running within the current version of our proprietary medical imaging software (RamSoft). The integration works well, but we have spent months struggling to identify the cause of a major conflict between the PostgreSQL/Tomcat integration and the Group Policy Client. Whenever the PostgreSQL or Tomcat services are running, we begin to see 1+ hour reboot times and gpupdate failures because the Group Policy Client hangs for long periods with no explanation. Simply disabling PostgreSQL/Tomcat resolves the issue and allows the Group Policy Client to do what it needs to do.

We have enabled all known debugging in Group Policy, PostgreSQL, Pentaho, and Tomcat, performed xBootMgr traces, Process Monitor analysis, and packet captures (Tomcat_BA Errors.txt attached), but have been unable to determine the cause of the conflict. We are also working with Microsoft, Apache, and PostgreSQL independently to try to identify the culprit. After spending weeks analyzing and reviewing our development team's internal notes, I am fairly confident that the root cause of this problem is related to the way we deployed Tomcat and the way Tomcat and PostgreSQL communicate with each other, but I have not found solid proof of this yet.

I have learned a lot about how PostgreSQL/Tomcat function in this environment over the last week, but I am not part of the team that deployed this, and I am certainly not an expert on Pentaho, PostgreSQL, or Tomcat. I have been collecting a list of debug errors/warnings from the Tomcat logs over the last few days (attached), and I am hoping someone who is an expert on this can review the list, provide an explanation and priority for each, and answer the following questions:

1. Are there any known conflicts between the PostgreSQL/Tomcat integration and Group Policy in Windows domain environments? Required configurations? Workarounds?
2. Does the 'standard' Pentaho deployment use two instances of Tomcat6.exe (one for the DI service and one for the BA service), or is this a deployment customization made by our development team?
3. Are there any special debugging options or monitoring tools that we could use to get more information about what PostgreSQL/Tomcat are doing during the time periods that Group Policy Client is hung?
4. Do you have any suggestions or options that we can try to see if our behavior changes?

Please let me know if there is any additional information I can provide to help.

Nick

Pentaho 5.4.0 with MapR 4.1.0 - creating shims

Hi All,

I need to use Pentaho 5.4.0 with MapR 4.1.0, but this version of Pentaho only supports MapR 4.0.1. Is there a way I can create a shim for MapR 4.1.0?
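
The workaround I have seen described (not verified by me) is to copy the shipped mapr401 shim folder inside the big data plugin, swap in the MapR 4.1.0 client libraries, and point the plugin at the new folder; the folder name mapr410 below is just an assumed example. Is that the right approach?

Code:

# data-integration/plugins/pentaho-big-data-plugin/plugin.properties
active.hadoop.configuration=mapr410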



Regards,
Vasu C

How to use Kitchen

Dear All,
I have read this wiki about Kitchen: http://wiki.pentaho.com/display/EAI/...0Documentation. But when I run Kitchen like this: "Kitchen.bat /req:kettleuser /user:admin /pass:admin /dir:/ /job:p /level:Basic", it tells me: ERROR: Kitchen can't continue because the job couldn't be loaded.

Can anyone help me solve this issue? Thanks!
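
One hedged guess: as far as I know Kitchen expects /rep rather than /req for the repository name, so if "kettleuser" is meant to be the repository, the call would look something like this (repository, credentials and job name are assumptions here):

Code:

Kitchen.bat /rep:kettleuser /user:admin /pass:admin /dir:/ /job:p /level:Basic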

Import & export JSON messages from blobs in an HBase column

Hi,

I have JSON messages stored as blobs in a single column in HBase. How can I extract the JSON messages?

I've been able to use the HBase input connector to get the records out of my HBase table, but so far I have been unable to convert the blob column into JSON. As far as I can tell, the JSON Input step doesn't accept an HBase column as input.

I'm relatively new to Kettle, so any help is appreciated.

Many thanks in advance.
Kind regards,
Martijn

Split transformation to Success or Failure depending on query result

Hi,
I have a query I am running through 'Execute SQL Statement'. How can I make the transformation fail depending on the result of the query?
I would really appreciate it if someone could reply quickly, since it's very urgent.
Thanks,
Ofer

Ignore rows with foreign key issues

We are considering using Pentaho Data Integration in our project to align two different databases and keep them aligned via a scheduled job. In the destination database, suppose we have two tables (table A and table B). Table B has a foreign key constraint pointing to table A. I load all the rows for table A and check whether each row is suitable for the destination table; if not, I simply discard it from the stream. Now, if one or more rows loaded for destination table B contain a reference to a previously discarded row, they should simply be ignored. So far, I get a foreign key error instead. The thorough solution would be to check whether the parent rows exist in table A, but in some cases we have more than 10 constraints, so this is not performant.
How can we proceed?
Thank you!

Unable to start blueprint container for bundle pentaho-requirejs-osgi-manager

I just installed Pentaho 6 and I'm getting the following error in my pentaho.log. My PENTAHO_JAVA is jdk1.8.0_65 and I am running on Red Hat 6.6.

Code:

2015-11-06 12:35:54,848 ERROR [org.apache.aries.blueprint.container.BlueprintContainerImpl] Unable to start blueprint container for bundle pentaho-requirejs-osgi-manager
org.osgi.service.blueprint.container.ComponentDefinitionException: Unable to initialize bean configManager
        at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:714)
        at org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:824)
        at org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:787)
        at org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)
        at org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:245)
        at org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:183)
        at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:682)
        at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:377)
        at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:269)
        at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:294)
        at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:263)
        at org.apache.aries.blueprint.container.BlueprintExtender.modifiedBundle(BlueprintExtender.java:253)
        at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:500)
        at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:433)
        at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$AbstractTracked.track(BundleHookBundleTracker.java:725)
        at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.bundleChanged(BundleHookBundleTracker.java:463)
        at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$BundleEventHook.event(BundleHookBundleTracker.java:422)
        at org.apache.felix.framework.util.SecureAction.invokeBundleEventHook(SecureAction.java:1103)
        at org.apache.felix.framework.util.EventDispatcher.createWhitelistFromHooks(EventDispatcher.java:695)
        at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:483)
        at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4403)
        at org.apache.felix.framework.Felix.startBundle(Felix.java:2092)
        at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1291)
        at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:304)
        at java.lang.Thread.run(Thread.java:745)
Caused by: Unexpected character (s) at position 0.
        at org.json.simple.parser.Yylex.yylex(Yylex.java:610)
        at org.json.simple.parser.JSONParser.nextToken(JSONParser.java:269)
        at org.json.simple.parser.JSONParser.parse(JSONParser.java:118)
        at org.json.simple.parser.JSONParser.parse(JSONParser.java:92)
        at org.pentaho.js.require.RequireJsConfigManager.loadJsonObject(RequireJsConfigManager.java:135)
        at org.pentaho.js.require.RequireJsConfigManager.updateBundleContext(RequireJsConfigManager.java:82)
        at org.pentaho.js.require.RequireJsConfigManager.init(RequireJsConfigManager.java:218)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.aries.blueprint.utils.ReflectionUtils.invoke(ReflectionUtils.java:297)
        at org.apache.aries.blueprint.container.BeanRecipe.invoke(BeanRecipe.java:958)
        at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:712)
        ... 26 more
2015-11-06 12:35:54,880 ERROR [org.apache.aries.blueprint.container.BlueprintContainerImpl] Unable to start blueprint container for bundle pentaho-requirejs-osgi-manager
org.osgi.service.blueprint.container.ComponentDefinitionException: Unable to initialize bean configManager
        at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:714)
        at org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:824)
        at org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:787)
        at org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)
        at org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:245)
        at org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:183)
        at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:682)
        at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:377)
        at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:269)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at org.apache.aries.blueprint.container.ExecutorServiceWrapper.run(ExecutorServiceWrapper.java:106)
        at org.apache.aries.blueprint.utils.threading.impl.DiscardableRunnable.run(DiscardableRunnable.java:48)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: Unexpected character (s) at position 0.
        at org.json.simple.parser.Yylex.yylex(Yylex.java:610)
        at org.json.simple.parser.JSONParser.nextToken(JSONParser.java:269)
        at org.json.simple.parser.JSONParser.parse(JSONParser.java:118)
        at org.json.simple.parser.JSONParser.parse(JSONParser.java:92)
        at org.pentaho.js.require.RequireJsConfigManager.loadJsonObject(RequireJsConfigManager.java:135)
        at org.pentaho.js.require.RequireJsConfigManager.updateBundleContext(RequireJsConfigManager.java:82)
        at org.pentaho.js.require.RequireJsConfigManager.init(RequireJsConfigManager.java:218)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.aries.blueprint.utils.ReflectionUtils.invoke(ReflectionUtils.java:297)
        at org.apache.aries.blueprint.container.BeanRecipe.invoke(BeanRecipe.java:958)
        at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:712)
        ... 21 more
2015-11-06 12:36:55,648 ERROR [org.pentaho.di.osgi.KarafLifecycleListener] Error in Blueprint Watcher
org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Unknown error in KarafBlueprintWatcher
        at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:89)
        at org.pentaho.di.osgi.KarafLifecycleListener$2.run(KarafLifecycleListener.java:112)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.pentaho.osgi.api.IKarafBlueprintWatcher$BlueprintWatcherException: Timed out waiting for blueprints to load: pentaho-big-data-api-hdfs,pentaho-big-data-impl-vfs-hdfs,pentaho-big-data-impl-clusterTests
        at org.pentaho.osgi.impl.KarafBlueprintWatcherImpl.waitForBlueprint(KarafBlueprintWatcherImpl.java:77)
        ... 2 more
2015-11-06 12:40:55,712 ERROR [org.apache.aries.blueprint.container.BlueprintContainerImpl] Unable to start blueprint container for bundle pentaho-big-data-impl-vfs-hdfs due to unresolved dependencies [(objectClass=org.pentaho.bigdata.api.hdfs.HadoopFileSystemLocator)]
java.util.concurrent.TimeoutException
        at org.apache.aries.blueprint.container.BlueprintContainerImpl$1.run(BlueprintContainerImpl.java:336)
        at org.apache.aries.blueprint.utils.threading.impl.DiscardableRunnable.run(DiscardableRunnable.java:48)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2015-11-06 12:40:55,797 ERROR [org.apache.aries.blueprint.container.BlueprintContainerImpl] Unable to start blueprint container for bundle pentaho-big-data-impl-clusterTests due to unresolved dependencies [(objectClass=org.pentaho.bigdata.api.hdfs.HadoopFileSystemLocator)]
java.util.concurrent.TimeoutException
        at org.apache.aries.blueprint.container.BlueprintContainerImpl$1.run(BlueprintContainerImpl.java:336)
        at org.apache.aries.blueprint.utils.threading.impl.DiscardableRunnable.run(DiscardableRunnable.java:48)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

Pentaho User Console Issues With Horizontal Bars

Guys,

I have created dashboards, but unfortunately the horizontal bars on some panels do not scroll on a few of them. Any ideas about what needs to be changed and where? I would appreciate any reply.

Thanks

Publishing a report to the Pentaho User Console

Greetings, I hope you can help me.


The situation is this: I have created a report that groups data, and when I preview it in Pentaho Report Designer it shows my data grouped in the following order:
It groups by 1. the person's institution, 2. the person's name, 3. the name of the sample classification type, and finally the laboratory samples the person has worked with.


As I mentioned, when I run it in PRD it shows the data grouped, but when I publish it to the Pentaho User Console it no longer shows all the data grouped. It groups by institution and person name, but it no longer groups from the classification type level down; that is, if there are classification types A1 and A2, it repeats the institution and person name data for each classification type:


This is how the published report shows the data:
Example.
Institution 1.
Person 1.
Type A1.
Lab sample a1


Institution 1.
Person 1.
Type A2
Lab sample a2.


When it should show them like this (in PRD it does):
Example
Institution 1.
Person 1.
Type A1.
Lab sample 1
Type A2.
Lab sample 2.




The data I consume comes from PostgreSQL. I built my query using UNION because of the database design.

Automatic Pagination - CDE - Table Component - CDA Data Source Query

Dear all,

I am looking for a way to automate the pagination of the Table Component (CDA data source query); that is, I need to update (at timed intervals) the rows shown on one page to the rows from the next page, automatically, without the user clicking the Next Page button.
I thought server-side pagination could do this, but it is not working.
Is there some JS function to do this in DataTables?

BI-SERVER VERSION: 5.4.0.1 130

Regards,

Amom

LDAP Input: returning a result when the user is not found

Hi,

I want to check whether a user exists in Active Directory using the LDAP Input step, searching by sAMAccountName. How can I check whether the query returns 0 rows (user not found)?

Thank you.

REST Client not working with biserver 6.0

Hello,

I had a transformation that worked with biserver 5.3, but now it doesn't work properly.

I am trying to create some users using the API. I do a PUT to http://***:8080/pentaho/api/userroledao/createUser with a JSON body.

With some users it works, but sometimes I get an HTTP 412 code saying that a precondition failed.

What could the problem be?
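
For reference, a minimal request of the shape the 5.x userroledao endpoint accepts is sketched below; whether 6.0 expects exactly the same shape is an assumption on my part, and the user name and password are made up:

Code:

PUT http://***:8080/pentaho/api/userroledao/createUser
Content-Type: application/json

{ "userName": "jdoe", "password": "Secret123" }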

Thanks.

HTTP Post SOAP Envelope Error

I'm running an HTTP Post to a SharePoint web service and getting the error below:

<?xml version="1.0" encoding="utf-8"?><soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><soap:Body><soap:Fault><soap:Code><soap:Value>soap:Receiver</soap:Value></soap:Code><soap:Reason><soap:Text xml:lang="en">Server was unable to process request. ---&gt; Data at the root level is invalid. Line 1, position 1.</soap:Text></soap:Reason><soap:Detail /></soap:Fault></soap:Body></soap:Envelope>

Below is the SOAP envelope file that I'm posting. I have looked at the provided samples, but none seem to work. Has anyone used this step successfully who can tell me where I'm going wrong? (I tried the Web Services Lookup step too, but it gave a Response 400 body error.)

<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:soap="http://schemas.microsoft.com/sharepoint/soap/">
<soapenv:Header/>
<soapenv:Body>
<soap:GetListItems>
<soap:listName>{A5E87CAF-3FC2-486E-B967-EA7A06FA76ED}</soap:listName>
<soap:viewName>{F008E87F-A438-41A5-A4E9-9092C4D1ADB6}</soap:viewName>
<soap:query><Query><Where><Gt><FieldRef Name="ID" /><Value Type="Counter">0</Value></Gt></Where></Query></soap:query>
<soap:viewFields><ViewFields> <FieldRef Name='Title' /> <FieldRef Name='Asset_Category' /><FieldRef Name='Sundial_Tag' /><FieldRef Name='Service_Tag' /><FieldRef Name='Model' /><FieldRef Name='Operating_System' /><FieldRef Name='User' /><FieldRef Name='Department' /></ViewFields></soap:viewFields>
<soap:rowLimit>100</soap:rowLimit>
<soap:queryOptions><QueryOptions><IncludeMandatoryColumns>FALSE</IncludeMandatoryColumns><DateInUtc>TRUE</DateInUtc></QueryOptions></soap:queryOptions>
<soap:webID>619f3bb5-093c-4c8f-a6aa-776c172a91cd</soap:webID>
</soap:GetListItems>
</soapenv:Body>
</soapenv:Envelope>

Show percent value on bar chart

Spoon Sqoop import from postgres

---- snip ------

sorry, wrong forum..

Error with Sqoop import from Postgres to Hive on Hadoop

Dear developers,

I get an error when doing a Sqoop import from a Postgres database to a Hive table in Hadoop.

When I run the Sqoop import manually via Hue, I use this import script:
import --connect jdbc:postgresql://ipaddress:5432/database --username=postgres --password=xxxxxx --table some_table --target-dir /user/someuser/folder --fields-terminated-by \t --verbose -m 1 -- --schema some_schema

In Spoon I use the advanced setup and put this script into the entry box; then I get an error about the schema option.
When I use the quick setup, I get an error that some_table was not found.
I get the same error when I run the Sqoop import manually in Hue without defining the schema.

I think you should add functionality to specify the schema for the Postgres connection.

By the way, I use PDI 6.0.0, Hadoop CDH 5.4, and Postgres 9.1.

Regards,
Fadjar Tandabawana

SAP BW

Friends, has anyone connected Pentaho to SAP BW?


Thank you!

Parameters in the sub-mapping step

Hi,

I'm trying to understand the following in PDI 5.4.
I have a main transformation which uses the Simple Mapping step to execute another transformation.
In the settings of the Simple Mapping step, I set a parameter with a static string value.

In the sub-transformation itself I declare the parameter with no default value.
Then I have a Get Variables step to read the parameter, followed by a logging step to see the value of the parameter.

The static value I set in the Parameters tab is NOT shown in the logging. The same happens when I use the 'normal' sub-mapping step.

We found that the option [Inherit all variables from the parent transformation] must be UNSET for the above example to work. That is weird, because inheriting suggests that all variables, INCLUDING the one you're setting now, will be passed.

The [HELP] doesn't make us much wiser:
Code:

Inherit all variables from the parent transformation: If this option is checked, all variables available in the parent transformation will be available in the sub-transformation, even if they are not explicitly specified in the Parameters tab.
IMPORTANT!! : Only those variables/values that are specified are passed down to the sub-transformation.

Can anyone help me understand this? Is this a bug, or am I missing something?