Channel: Pentaho Community Forums

Weird conversion String to Date on offset location

Hi everybody,
I'm currently reading values from an XML input stream, and deserializing the data to match the correct data type.

The problem comes from an error I get when deserializing one specific date; the error message is below:
"couldn't convert string [25/03/2018 02:11:00] to a date using format [dd/MM/yyyy HH:mm:ss] on offset location 19".

This looks pretty weird to me, since as far as I can see the format matches exactly the Date format in the XML file (<MY_DATE>25/03/2018 02:11:00</MY_DATE>).
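For what it's worth, 25 March 2018 was the spring DST changeover in many European time zones, so the wall-clock time 02:11 may simply not exist there, and a strict (non-lenient) date parser will then reject a string that otherwise matches the mask perfectly, typically reporting an error offset at or near the end of the string. A minimal standalone sketch to test that hypothesis (Europe/Paris is only an example zone; substitute the zone your PDI server runs in):

Code:

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class DstParseCheck {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat fmt = new SimpleDateFormat("dd/MM/yyyy HH:mm:ss");
        // Example zone: in Europe/Paris the clocks jumped from 02:00 to 03:00
        // on 25 March 2018, so 02:11 never existed on that day.
        fmt.setTimeZone(TimeZone.getTimeZone("Europe/Paris"));

        fmt.setLenient(true);   // lenient parsing shifts the time across the DST gap
        System.out.println(fmt.parse("25/03/2018 02:11:00"));

        fmt.setLenient(false);  // strict parsing rejects the non-existent local time
        try {
            fmt.parse("25/03/2018 02:11:00");
        } catch (ParseException e) {
            System.out.println("Failed at offset " + e.getErrorOffset());
        }
    }
}

If the strict parse fails on your machine while the lenient one succeeds, the format string is fine and the problem is the time zone, not the data.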

Has anyone run into this already?

Many thanks!

Address already in use: JVM_Bind. Unable to list jar files in plugin folder 'plugins'

Hello. Pentaho always seems to be good at wasting time. Does anyone see what is wrong and why I don't get any version output? The mass of error messages overflows my brain.

Execute command:

c:\godesys\Server\Pentaho\design-tools\data-integration>Kitchen.bat /version

Disturbing Result:


DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\godesys\Server\Pentaho\java
DEBUG: _PENTAHO_JAVA=C:\godesys\Server\Pentaho\java\bin\java.exe
DEBUG: PENTAHO_INSTALLED_LICENSE_PATH=C:\godesys\Server\Pentaho\.installedLicenses.xml

c:\godesys\Server\Pentaho\design-tools\data-integration>"C:\godesys\Server\Pentaho\java\bin\java.exe" "-Dpentaho.installed.licenses.file=C:\godesys\Server\Pentaho\.installedLicenses.xml" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" "-Dpentaho.installed.licenses.file=C:\godesys\Server\Pentaho\.installedLicenses.xml" -jar launcher\pentaho-application-launcher-7.1.0.0-12.jar -lib ..\libswt\win64 -main org.pentaho.di.kitchen.Kitchen -initialDir "c:\godesys\Server\Pentaho\design-tools\data-integration"\ /version
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
11:40:08,944 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
11:40:10,121 ERROR [ServerSocketBasedKarafInstanceResolver] Error creating ServerSocket
java.net.BindException: Address already in use: JVM_Bind
    at java.net.DualStackPlainSocketImpl.bind0(Native Method)
    at java.net.DualStackPlainSocketImpl.socketBind(Unknown Source)
    at java.net.AbstractPlainSocketImpl.bind(Unknown Source)
    at java.net.PlainSocketImpl.bind(Unknown Source)
    at java.net.ServerSocket.bind(Unknown Source)
    at java.net.ServerSocket.<init>(Unknown Source)
    at java.net.ServerSocket.<init>(Unknown Source)
    at org.pentaho.platform.osgi.ServerSocketBasedKarafInstanceResolver.resolveInstanceNumber(ServerSocketBasedKarafInstanceResolver.java:207)
    at org.pentaho.platform.osgi.ServerSocketBasedKarafInstanceResolver.resolveInstance(ServerSocketBasedKarafInstanceResolver.java:65)
    at org.pentaho.platform.osgi.KarafInstance.assignPortsAndCreateCache(KarafInstance.java:148)
    at org.pentaho.platform.osgi.KarafBoot.createAndProcessKarafInstance(KarafBoot.java:329)
    at org.pentaho.platform.osgi.KarafBoot.startup(KarafBoot.java:224)
    at org.pentaho.di.osgi.registryExtension.OSGIPluginRegistryExtension.init(OSGIPluginRegistryExtension.java:109)
    at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:596)
    at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:115)
    at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:79)
    at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:91)
    at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:84)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
11:40:14,179 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 2 at c:\godesys\Server\Pentaho\design-tools\data-integration\.\system\karaf\caches\kitchen\data-1 ***
*** FastBin Provider Port:52902 ***
*** Karaf Port:8803 ***
*** OSGI Service Port:9052 ***
*******************************************************************************
Mär 28, 2018 11:40:15 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFORMATION: Lock acquired. Setting startlevel to 100
2018/03/28 11:40:16 - Kitchen - Kettle version 7.1.0.0-12, build 1, build date : 2017-05-16 17.18.02
2018/03/28 11:40:16 - Kitchen - Start of run.
ERROR: Kitchen can't continue because the job couldn't be loaded.
org.pentaho.di.core.exception.KettleFileException:

Unable to list jar files in plugin folder 'plugins'

Unable to get VFS File object for filename 'plugins' : Could not find file with URI "c:\godesys\Server\Pentaho\design-tools\data-integration\plugins" because it is a relative path, and no base URI was provided.

Unable to get VFS File object for filename 'plugins' : Could not find file with URI "c:\godesys\Server\Pentaho\design-tools\data-integration\plugins" because it is a relative path, and no base URI was provided.

    at org.pentaho.di.core.plugins.PluginFolder.findJarFiles(PluginFolder.java:144)
    at org.pentaho.di.core.plugins.PluginFolder.findJarFiles(PluginFolder.java:117)
    at org.pentaho.di.core.plugins.JarFileCache.getFileObjects(JarFileCache.java:67)
    at org.pentaho.di.core.plugins.BasePluginType.findAnnotatedClassFiles(BasePluginType.java:253)
    at org.pentaho.di.core.plugins.BasePluginType.registerPluginJars(BasePluginType.java:556)
    at org.pentaho.di.core.plugins.BasePluginType.searchPlugins(BasePluginType.java:120)
    at org.pentaho.di.core.plugins.PluginRegistry.registerType(PluginRegistry.java:636)
    at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:608)
    at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:575)
    at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:136)
    at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:98)
    at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:79)

c:\godesys\Server\Pentaho\design-tools\data-integration>

Switch Language again and again

Hello.

When I switch the language, it is reset again at the next login.
Tested on Pentaho 7.1 Business Edition and Pentaho 8.0 Community Edition.

Seems to be another great feature, or does it work for you?

Kettle jobs going zombie

We've been having some intermittent hanging of jobs lately with DI 6.1 that has been causing us a fair amount of trouble. Unfortunately, we haven't been able to narrow down any cause for it so I am hoping someone here can help us.

The jobs in question are scheduled ETL jobs that run every half hour on our Linux servers and move data between our transactional db and the datamart db. Since we have some larger jobs that also run on those boxes and use some of the same tables, we have a system set up that prevents new jobs from kicking off if any Kitchen job is still running when it starts up. These processes usually only take about 7-15 minutes to run, so having one of them get stuck and hold everything up for hours causes a lot of problems.

Has anyone else had issues with kettle jobs turning zombie on Linux? They can't be stopped normally, we have to use kill -9 on them to get rid of them. We have had a few occasions where a process will recover after a few hours and finish but usually they've gone full zombie and need to be killed in order to free up our system and let the jobs run on schedule again.

So far we haven't seen any errors or warnings in the logs when this happens, even with logging set to Detailed. The process simply stops responding or logging anything at that point. Memory and CPU usage on the Linux boxes doesn't seem to be spiking or maxed out, nor are the MySQL boxes showing any locking issues or other problems in their logs.

Another complicating factor is that they do not seem to hang at any single point in the jobs or at any consistent time. I've seen them hang in the middle of jobs, right after jobs, and even after the very last job/transformation of the overall project when there is literally no work left to be done.

I'm at my wit's end here, so any suggestions would be welcome!

How to Schedule a PDI Job to be Quarterly?

This might just have a simple answer that I'm not seeing in the standard PDI scheduling options, but I'm trying to schedule a Job to run every three months rather than monthly.
I can see that under the Monthly option I can only select a specific day of each month.
Under the Weekly option there is the ability to run a Job every X days; it would be nice if there were an equivalent under the Monthly option, so I could choose to run it every three months. However, this is not an option.

Is there something I'm missing, or will I have to schedule it Monthly and then build some extra logic into my Job to determine whether it is running in the first month of a quarter, and suppress the output of the results if it is not?
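For the fallback described above (schedule the Job monthly and skip the work when the month is not the first month of a quarter), the check itself is tiny. A minimal sketch, assuming the guard lives in a User Defined Java Class step or anywhere else plain Java is available; the class name and messages are purely illustrative:

Code:

import java.util.Calendar;

public class QuarterGuard {

    // True only in January, April, July and October (Calendar.MONTH is zero-based).
    static boolean isFirstMonthOfQuarter(Calendar cal) {
        return cal.get(Calendar.MONTH) % 3 == 0;
    }

    public static void main(String[] args) {
        if (isFirstMonthOfQuarter(Calendar.getInstance())) {
            System.out.println("Quarterly run - produce the output.");
        } else {
            System.out.println("Not the first month of a quarter - suppress the output.");
        }
    }
}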

Ed

Using Samba as Active Directory & configuring it with pentaho user console 7.1

I am using Samba as an Active Directory and trying to configure it with the Pentaho User Console 7.1, in the same way as LDAP.
But it gives the error "Strong Authentication Required: Bind Simple: Transport encryption required".




Can anyone help me out with this error?

Pentaho caches

Hello,

we have a problem, presumably with the Mondrian cache of the Pentaho Server or with the CDE cache; we are not sure which of them.

Every night we execute a series of Kettle ETLs that load data into a MySQL database, on top of which we have built cubes that are published to the Pentaho Server. We have some dashboards, with bar charts and so on, that run MDX queries against those cubes.

Well, the next day, when we open the Pentaho Server to view the dashboards, we see that the charts do not match the data in the MySQL database. To make them match, we always have to go to the "Tools -> Refresh" menu and select all the refresh options there: System Settings, Reporting Data, Global Variables, Mondrian Schema Cache, Reporting Data Cache and CDA Cache. Sometimes we even have to do it two or three times.

Once this is done, the data from the database and the dashboards agree.

How could we solve this problem? Is there any way to refresh all of these caches from the Kettle ETL processes? What other ways would we have to solve it?
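One approach, assuming your Pentaho Server release exposes the cache-refresh REST endpoints (worth verifying against your version's API documentation): call them from the end of the nightly Kettle job, for example with an HTTP client step or a small helper class. A minimal sketch with plain HttpURLConnection; the host, credentials and exact endpoint paths below are assumptions to adapt:

Code:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class RefreshPentahoCaches {
    public static void main(String[] args) throws Exception {
        // Assumed host and credentials - replace with your own.
        String base = "http://biserver:8080/pentaho";
        String auth = Base64.getEncoder()
                .encodeToString("admin:password".getBytes("UTF-8"));

        // Endpoint paths to verify against your Pentaho Server version.
        String[] endpoints = {
            base + "/api/system/refresh/mondrianSchemaCache",
            base + "/plugin/cda/api/clearCache"
        };

        for (String endpoint : endpoints) {
            HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
            conn.setRequestProperty("Authorization", "Basic " + auth);
            System.out.println(endpoint + " -> HTTP " + conn.getResponseCode());
            try (InputStream in = conn.getInputStream()) {
                in.skip(Long.MAX_VALUE); // drain the response
            }
            conn.disconnect();
        }
    }
}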

Best regards.

Abort Job without error if "Execute SQL Statements" fails

I've got a strange issue that I'm hoping someone can help with.
We have many jobs scheduled concurrently that all call the same set of jobs/transformations.
The first step is to update a row in a queuing table. The problem is that we're getting row locking from time to time because two different jobs are trying to pick the same row in the queue (and change its status).
What I'd like is that if a row is locked (causing a SQL error: "Error committing transaction: updates conflict with updates in another transaction."), the job just aborts without error.
We are getting many rows on our /kettle/status page that show false-positive errors. I mean, they're actual errors, we just don't care about them.
If one fails, the next one will pick it up if it's not locked. Not a huge deal.
So, is there a way to have the job abort (without error) if the SQL error occurs?
Thanks!

CSV Export method From PentahoCE BI 6 to Pentaho CE BI 8

Good afternoon,

Last week we migrated from Pentaho CE 6 to Pentaho CE 8, and we migrated our app.
Everything is working like a charm except for exporting datatables with large data.

The issue is that in Pentaho 8 the Excel export takes 5 to 15 minutes to export 15k to 60k lines; as an alternative, the CSV export takes seconds to do the same, but Excel doesn't understand the default encoding of those files.

In Pentaho CE 6.1, the same data exported to CSV was encoded correctly and read by Excel.

In Pentaho 8, these are the tests I've performed:
  • Export Component, with the behavior explained previously.
  • Export Popup Component, with the behavior explained previously.
  • Button Component with the exportData function, with the same results.


Code:

function exportData(){
    render_relatorio.queryState.exportData('csv', null,  {filename:'custom_name.csv'});
}


  • Modify "..\pentaho-server\pentaho-solutions\system\pentaho-cdf\js\queries\CdaQuery.js"


I've tried to modify the ajax call with the desired contentType, but for some reason changes to this JavaScript file are not reflected in "..\pentaho-server\pentaho-solutions\system\pentaho-cdf\js\cdf-bootstrap-script-includes.js"

Code:

$.ajax({
        type: 'POST',
        dataType: 'text',
        async: true,
        data: queryDefinition,
        //contentType: "charset=ISO-8859-15",
        contentType: "charset=windows-1252",
        url: this.getOption('url'),
        xhrFields: {
          withCredentials: true
        }
      }).done(function(uuid) {
        var _exportIframe = $('<iframe style="display:none">');
        _exportIframe.detach();
        _exportIframe[0].src = CdaQueryExt.getUnwrapQuery({"path": queryDefinition.path, "uuid": uuid});
        _exportIframe.appendTo($('body'));
      }).fail(function(jqXHR, textStatus, errorThrown) {
        Logger.log("Request failed: " + jqXHR.responseText + " :: " + textStatus + " ::: " + errorThrown);
      });
    },


  • I've also tested this on Pentaho CE 7, with the same behavior as in 8, so it kinda feels like intended behavior.


My question is: is there any way to correct this behavior so that my client can open the CSV automatically in Excel?

I know it is possible to "import" the CSV file into Excel and select the UTF-8 encoding, and the file will load correctly, but my client considers this a suboptimal solution.
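A likely reason Excel mishandles the file is that it only auto-detects UTF-8 when the CSV starts with a byte-order mark; without the BOM it falls back to the local ANSI code page. As an illustration of the idea (not a CDF/CDA setting), a small post-processing step could prepend the BOM to the exported file before handing it to the client; the file name is a placeholder:

Code:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class AddUtf8Bom {
    public static void main(String[] args) throws IOException {
        // Placeholder path - point this at the real exported file.
        Path csv = Paths.get("custom_name.csv");
        byte[] original = Files.readAllBytes(csv);

        // UTF-8 byte-order mark (EF BB BF); Excel uses it to pick UTF-8 automatically.
        byte[] bom = {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF};
        byte[] withBom = new byte[bom.length + original.length];
        System.arraycopy(bom, 0, withBom, 0, bom.length);
        System.arraycopy(original, 0, withBom, bom.length, original.length);

        Files.write(csv, withBom);
    }
}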

Kind regards.

Execute for every input row

Morning all,

I am having issues with a job that requires a step to execute for every input row. I have used this option previously with success but am struggling with this one. Please see the screenshot below:

[screenshot not available]
The process is successful for a single row, moving an image file into the designated folder.

It is not, however, looping through all of the filenames output by the previous steps.

Any help is much appreciated!

Jimmy

PDI - Put a file with SFTP

Hi All,

I got the following error when the job tries to upload a file via the 'Put a file with SFTP' step.
Does anybody have an idea why this is happening? It worked properly before.

Thanks in Advance!

ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Error while trying to send files. Exception :
3: Permission denied
Permission denied
ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
3: Permission denied
Permission denied


at org.pentaho.di.job.entries.sftp.SFTPClient.put(SFTPClient.java:261)
at org.pentaho.di.job.entries.sftpput.JobEntrySFTPPUT.execute(JobEntrySFTPPUT.java:887)
at org.pentaho.di.job.Job.execute(Job.java:723)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:545)
at org.pentaho.di.job.Job.run(Job.java:435)
Caused by: 3: Permission denied
at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2491)
at com.jcraft.jsch.ChannelSftp._put(ChannelSftp.java:528)
at com.jcraft.jsch.ChannelSftp.put(ChannelSftp.java:480)
at org.pentaho.di.job.entries.sftp.SFTPClient.put(SFTPClient.java:259)
... 12 more

Is there a way to create new/select predefined database connections dynamically?

Hello Pentaho Community,

I've gone through many posts in the past few days, but was unable to find a suitable solution for my challenge. I hope someone has done something similar in the past and would be willing to share...

I have a requirement to get hundreds of tables residing on different databases flattened and shared with another system. Instead of creating a separate transformation for each table I need to export, I created one job that loops through a driver file that contains all parameters (SQL, output file, etc.) for all tables; that works just fine. The part I'm not sure how to implement is how to create a new connection, or select a pre-defined one, dynamically from that same driver file or from a list of existing shared connections.

When the process runs, it goes through each record in the driver table and creates a text file. I'm looking for a way to either pass all components of a connection as variables and create that connection at run time, or select a pre-defined connection based on the driver entry. I'm using a Table Input step to pull the data with a SQL statement passed via a variable, and I need to be able to create/select a connection that corresponds to the source table, which may reside on SQL Server, Oracle or Sybase. We have a shared kettle.properties file on each environment to maintain the corresponding connection properties. I think it would be a simpler task if the data came from one database, but I cannot think of a way to create a connection based on database type.
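One pattern that may fit: keep the database type, host, port, database name, user and password as additional columns in the driver file (or in kettle.properties) and reference them as ${VARIABLE} in the fields of a single shared connection, since the connection dialog accepts variables in every field. If the connection really has to be built in code, a minimal sketch using Kettle's DatabaseMeta API; the constructor arguments and type codes should be verified against your PDI version, and all values shown are illustrative:

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.database.DatabaseMeta;

public class DynamicConnectionSketch {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // Values that would normally come from the driver file / kettle.properties.
        String dbType = "MSSQL";   // e.g. MSSQL, ORACLE, SYBASE
        String host   = "dbhost";
        String port   = "1433";
        String dbName = "sales";
        String user   = "etl";
        String pass   = "secret";

        // name, type, access, host, database, port, user, password
        DatabaseMeta meta = new DatabaseMeta("dynamic_conn", dbType, "Native",
                host, dbName, port, user, pass);
        System.out.println("Resolved JDBC URL: " + meta.getURL());
    }
}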

Appreciate your help in advance!

Permission denied when PDI use Put a file with SFTP

Hi All,

I use PDI 6.1, and when the job uses 'Put a file with SFTP' I get the following error message:

ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Error while trying to send files. Exception :
3: Permission denied
Permission denied
ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
3: Permission denied
Permission denied

at org.pentaho.di.job.entries.sftp.SFTPClient.put(SFTPClient.java:261)
at org.pentaho.di.job.entries.sftpput.JobEntrySFTPPUT.execute(JobEntrySFTPPUT.java:887)
at org.pentaho.di.job.Job.execute(Job.java:723)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:545)
at org.pentaho.di.job.Job.run(Job.java:435)
Caused by: 3: Permission denied
at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2491)
at com.jcraft.jsch.ChannelSftp._put(ChannelSftp.java:528)
at com.jcraft.jsch.ChannelSftp.put(ChannelSftp.java:480)
at org.pentaho.di.job.entries.sftp.SFTPClient.put(SFTPClient.java:259)


Does anybody have any idea why this happens? It previously worked properly for me.

Thanks in advance!
Pali

Export Data from Google Cloud

Hi, is there a way I can connect to Google Cloud through a Table Input step? Or is there a way I can add the Google Cloud SQL query? Please help. Thanks!
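If the source is Cloud SQL for MySQL (an assumption, since the post doesn't say which Google Cloud database is involved), a Table Input step can use an ordinary MySQL connection pointed at the instance's public IP, provided your client IP is authorized in the Cloud SQL settings. The equivalent plain JDBC sketch, with a made-up IP, database and credentials:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CloudSqlQuery {
    public static void main(String[] args) throws Exception {
        // Made-up public IP, database and credentials of the Cloud SQL instance.
        String url = "jdbc:mysql://203.0.113.10:3306/mydb?useSSL=true";
        try (Connection con = DriverManager.getConnection(url, "report_user", "secret");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM sales")) {
            while (rs.next()) {
                System.out.println("rows: " + rs.getLong(1));
            }
        }
    }
}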

Jobs stop for no reason

My client has a Windows Server 2012 R2 box with SQL Server 2016. We use PDI 7, and the loads are run from the Windows scheduler. It happens quite often that PDI stops somewhere in a job for no apparent reason. The log of the transformation shows nothing wrong, just that it ends with STOP instead of END. This causes the job to stop prematurely without running all the transformations. This is one of the main reasons why my client is now abandoning PDI and going (back) to SSIS. Who else has this issue and knows what causes it? I think it's [$#&^$#^#&^$} that it's so unreliable.

Parameter for column / table names in SQL query in Pentaho CDE dashboard

Hello,

I'm trying to use parameters for the column and table names in an SQL query over JNDI, which doesn't work, while parameters for values work fine.

I defined 3 simple parameter components with default values:

Code:

paramTable = 'dummyTable'
paramColumn = 'dummyColumn'
paramValue = 'dummyValue'

Then I defined an SQL query over JNDI that looks like:

Code:

select * from ${paramTable} where ${paramColumn} = ${paramValue};
... which does not work, because parameter values of paramTable and paramColumn don't seem to apply properly, which causes:
Code:

Caused by: java.sql.SQLSyntaxErrorException: ORA-00903: invalid table name
Here are some query modifications, and their results really disappoint me:

Code:

select * from dummyTable where dummyColumn = ${paramValue};
-- works fine and delivers 1 row

select * from ${paramTable} where dummyColumn = 'dummyValue';
-- causes 'invalid table name'

select * from dummyTable where ${paramColumn} = 'dummyValue';
-- does not cause any errors, but if I console.log() the result set of the query,
  it is empty (1 row is expected though)

Am I missing something?



I appreciate any help.
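For context, assuming CDA passes CDE parameters to the database as JDBC bind variables: bind variables can only stand in for values, never for identifiers such as table or column names. That would explain exactly the pattern above: value substitution works, table substitution is rejected, and a parameter in the column position is compared as a string literal ('dummyColumn' = 'dummyValue'), which is why the last query returns no rows. A minimal plain-JDBC illustration, with made-up connection details:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BindVariableDemo {
    public static void main(String[] args) throws Exception {
        // Made-up connection details, just for the illustration.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521/ORCL", "scott", "tiger")) {

            // Works: the bind variable replaces a value.
            try (PreparedStatement ps = con.prepareStatement(
                    "select * from dummyTable where dummyColumn = ?")) {
                ps.setString(1, "dummyValue");
                try (ResultSet rs = ps.executeQuery()) { /* 1 row expected */ }
            }

            // Runs, but compares the literal 'dummyColumn' with 'dummyValue' -> 0 rows.
            try (PreparedStatement ps = con.prepareStatement(
                    "select * from dummyTable where ? = 'dummyValue'")) {
                ps.setString(1, "dummyColumn");
                try (ResultSet rs = ps.executeQuery()) { /* empty result */ }
            }

            // Identifiers cannot be bound at all - this would not even prepare:
            // con.prepareStatement("select * from ? where dummyColumn = 'dummyValue'");
        }
    }
}

The usual workaround is to validate the table/column name against a whitelist and splice it into the SQL text before it reaches the driver, rather than binding it.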

How can I obtain an OAuth access token using a private key in a JSON file? (Google)

Hello everyone, I'm convinced this situation has already happened to some of you, so I'm here to request your help. I'm trying to access some of Google's APIs, and the call must be authenticated with an OAuth token, which I get in exchange for my private key. I have the private key in both JSON and P12 formats. I'm not a Java developer and I'm unsure how to proceed.

I understand that Google provides API client libraries for getting OAuth tokens in a number of languages, including Java, but I don't know how to add one to Pentaho Data Integration. I tried searching the forum and did some Google searches, but I was unable to find a guide to implementing this.

Can someone at least point me in the right direction? I have programming experience with JavaScript and PHP, but zero knowledge of Java.
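A hedged sketch of the usual pattern with Google's Java client library: drop the google-api-client jar and its dependencies into data-integration/lib (restart Spoon afterwards) and call it from a User Defined Java Class step or a tiny helper jar. The JSON file name and the scope below are placeholders; check Google's current documentation for the exact artifacts and scopes of the API you are calling:

Code:

import java.io.FileInputStream;
import java.util.Collections;

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;

public class ServiceAccountToken {
    public static void main(String[] args) throws Exception {
        // Placeholder path to the service-account JSON key downloaded from Google.
        try (FileInputStream key = new FileInputStream("service-account.json")) {
            GoogleCredential credential = GoogleCredential.fromStream(key)
                    // Placeholder scope - use the scope of the API you actually call.
                    .createScoped(Collections.singletonList(
                            "https://www.googleapis.com/auth/analytics.readonly"));

            credential.refreshToken();                   // exchange the key for a token
            String accessToken = credential.getAccessToken();
            System.out.println("Authorization: Bearer " + accessToken);
        }
    }
}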

Got Error after successful run

Hi All,

My job runs properly, the files are in the expected place, and the transformations are successful as well, so the scheduled job works as expected, but I always get an email after the run as if it had failed!
Is anybody facing the same issue?


Error message:

ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Error while trying to send files. Exception :
3: Permission denied
Permission denied
ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
3: Permission denied
Permission denied


at org.pentaho.di.job.entries.sftp.SFTPClient.put(SFTPClient.java:261)
at org.pentaho.di.job.entries.sftpput.JobEntrySFTPPUT.execute(JobEntrySFTPPUT.java:887)
at org.pentaho.di.job.Job.execute(Job.java:723)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:864)
at org.pentaho.di.job.Job.execute(Job.java:545)
at org.pentaho.di.job.Job.run(Job.java:435)
Caused by: 3: Permission denied
at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2491)
at com.jcraft.jsch.ChannelSftp._put(ChannelSftp.java:528)
at com.jcraft.jsch.ChannelSftp.put(ChannelSftp.java:480)
at org.pentaho.di.job.entries.sftp.SFTPClient.put(SFTPClient.java:259)
... 12 more

Problems publishing a cube: it publishes but is not displayed

Hello.

I have a cube published against an Oracle database, and it works for me.

Another Oracle database server has been set up. In Schema Workbench, the new connection is taken into account, as well as the Oracle schema, which also changes.

The tables have been copied to the new database.

Schema Workbench does not give any error and publishes correctly, but when I create a new Saiku analysis in the BI console, the cube is not shown.

The connection is correct both in Workbench and in the BI server, since the test returns OK.

In the catalina logs I do not see anything that tells me what it does not like about the cube.

How can I find out where the problem is?

Thanks