Channel: Pentaho Community Forums

Auto include feature

Hi ,

I am new to the Ctools . I have seen Pedro's blog on auto include , but I am afraid , i did not understand it completely.

This is the usecase I am looking for :

I have a CDE dashboard page with one selector (I have many , but for the example lets say one :) )

My selector will be filled by a KTR file.

I know how to link it using the normal way ( Create a data source which points to the KTR file )

If I have a project folder "projectA" under "public" directory and a "cdf" folder and it has "includes/public/projectA/myktr.ktr" , how can I use the autoinclude feature to use this KTR to source data for selector ? how the selector definition look like in this case ?

Open crosstab editor after you are finished?

Is it possible to open the Crosstab editor again for a crosstab that has already been created?

PostgreSQL psql bulk loader

I am trying to use the PostgreSQL Bulk Loader and received these messages:

Error in step
org.pentaho.di.core.exception.KettleException
Error while executing psql: "C:\Program Files\PostgreSQL\9.1\pgAdmin III" -U osmar -h localhost -p 5432 sbs_hemopa_local
Executable name has embedded quote, split the arguments


Specification of the step:
Step name: PostgreSQL Bulk Loader
Connection: conexao_sbs_hemopa
Target schema: public
Target table: obj220
Path to the psql client: ${psql_path}
Load action: Truncate
DB Name Override: (empty)
Enclosure: (empty)
Delimiter: |
Stop on error: (not marked)


I inserted this line in kettle.properties:
psql_path=C:\\Program Files\\PostgreSQL\\9.1\\pgAdmin III\\

What am I doing wrong?
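For comparison, a hedged sketch, not a confirmed fix: the "Path to the psql client" field expects the psql executable, while the value above points at the pgAdmin III folder, and the quoting around that space-containing path appears to be what triggers the embedded-quote error. The path below assumes a default PostgreSQL 9.1 install layout.

Code:

# kettle.properties - example value only; adjust to the actual psql.exe location
psql_path=C:\\Program Files\\PostgreSQL\\9.1\\bin\\psql.exe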

Thanks for help.

Osmar Mateus

Portrait and landscape in the same report

Hi,



I need to make a report with some pages in portrait and some pages in landscape. Is there a way to do that, maybe using a subreport or by editing the XML?



Thanks.

fuzzy match bug?

I want to look up postcodes against a user-generated "area" field. Matching works OK, but the returned field seems scrambled: I get strings like "B@4fde0ff6" instead of 4-digit postcodes. I have attached the transformation and a sample of the source files. Can someone please check what went wrong?
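As an aside, the scrambled value has the shape of a Java byte array's default toString() ("[B@" plus a hex hash), which would suggest the field is still carried as binary when it is written out. A minimal Java illustration of that behaviour (not the poster's transformation):

Code:

public class ByteArrayToString {
    public static void main(String[] args) {
        byte[] postcode = "1234".getBytes();
        // Printing the array object itself yields its identity hash,
        // e.g. "[B@4fde0ff6" - the same shape as the scrambled values.
        System.out.println(postcode);
        // Converting the bytes to a String first gives the expected text.
        System.out.println(new String(postcode));
    }
}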

newbie - data visualization from MongoDB on ARM

Hello,
I like doing home automation, and I have sensor data collected in a MongoDB database: things like temperature and energy usage. I'd like to be able to make charts of my energy usage, and maybe put temperature and energy usage on the same chart.


Is Pentaho a good tool for this?


My system: MongoDB is installed on an ARM-based single-board computer (a Cubieboard2 running Ubuntu). I can envision two ways of getting to the visualization:


1) Server based: the visualization runs on a server, and I use the web browser to select datapoints of interest. Using the web interface, I would zoom in/out, etc... But this would require the visualization program to be installable on an ARM device running Ubuntu.


2) Client based: I install the program on my PC, which queries the Mongodb database for info. Charts are generated locally from the PC, no web interface.


I would prefer the Server option, since it seems I would be able to pull up charts anywhere I want instead of only at my desk. But I'm ok if only the desktop solution is possible. Is Pentaho PDI the right tool? Or maybe you can point me in the right direction for my data visualization needs if Pentaho is not the right fit?

Metadata published to Server not available to plug-ins

Hi all,

I am using BI Server 5.0.1 and PME 5.1.0. I created a metadata file in PME and published it to the server. On the server, my domain shows up in the list of data sources as a metadata file; however, it is not available to the Pivot4J and Saiku plug-ins that I have. Steel Wheels and SampleData do show up in the plug-ins, though. Please help.

Regards,
DS

Schema Workbench: when I publish, I receive this error: 401 Unauthorized

Hello,

When I use Pentaho Schema Workbench, I can't publish because I receive this error:



This is the log:
jul 21, 2014 2:51:03 PM org.pentaho.mondrian.publish.workbench.PublishUtil fetch ReservedChars
WARNING: Reserved character call failed: GET http://localhost:8080/pentaho/api/re...rvedCharacters returned a response status of 401 Unauthorized

DB: MySQL

Please advise.

[Help] [Beginner] [Formula] Assigning a value

Hi, guys.

Look at this broken formula: IF([Type of Node] = "Enterprise"; [EnterpriseCNPJ] !!! [CNPJ]; "")

Where the !!! is (my main doubt), I'm trying to assign values from EnterpriseCNPJ to CNPJ (integer values), but only for rows whose Type of Node (a string) is "Enterprise". The "" is there to put NULL otherwise (not outputting the row at all would be even better here, but I don't know how to do that). Thanks.
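For illustration only, a hedged sketch of how such an expression usually ends up being written: rather than assigning to an existing field, the formula is typically attached to a new calculated field (any name for it is the reader's choice), and the whole expression returns the value that field should hold.

Code:

=IF([Type of Node] = "Enterprise"; [EnterpriseCNPJ]; "")

Dropping the non-Enterprise rows entirely would normally be done in the data query itself rather than in the formula, since a formula only computes a value for each row it is given.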

Error in scheduler when sending e-mail

Hi,
I'm using Pentaho 4.8 and I'm trying to schedule a report to be sent by e-mail, but the following error occurs:

RuntimeContext.ERROR_0018 - [pt_39] The requested parameter data could not be fulfilled

Does anybody have an idea what happened?

Is this a bug?

When using the Synchronize After Merge step, I find that PDI seems to freeze up when I enter the name and schema of the target table. Looking on the MySQL server, I can see that PDI has run a "Select *" from the target table. The table is HUGE, so it's no wonder PDI is freezing. This ought to be a select * with a LIMIT statement on it.

It seems like a bug to me. Can anyone tell me the right way to report it? Or if it's not a bug, could anyone explain why it would need to select the entire table?
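For illustration, the difference being described, with a hypothetical table name (the real table is not named in the post):

Code:

-- What appears to be issued today (scans the whole, huge table):
SELECT * FROM target_table;

-- What a metadata-only probe could look like instead (returns the column
-- definitions without reading any rows):
SELECT * FROM target_table LIMIT 0;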

Thanks!

Scott

Row Denormalizer has only last row as result

Dear Community,

I am pretty much a newbie with PDI, so my apologies if I am asking something very basic.

I am trying to denormalize data from an XML file, but the output of the Row Denormalizer step is just the last row of the XML file. While the "Get data from XML" step is running, I can see in the step metrics that all the rows in the XML file are processed. The denormalizer also reads all the lines from the file, but once it writes to the Text file output step, there is only the header row and the last XML entry.

The version of PDI is 5.0.1 on Ubuntu 12.04

Thanks a lot in advance,
Bulk

Attached Files: denorm_test.ktr, denorm_test.txt, denorm_test.xml

Import Arabic data to phpMyAdmin

Hi,
I made a transformation to import a CSV file into MySQL (via phpMyAdmin). My CSV file contains Arabic words and is encoded in UTF-8, but in phpMyAdmin I see ???????????.
When I import the CSV file directly through phpMyAdmin, everything is OK.
Help please.
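A common culprit in this situation is the character set of the JDBC connection rather than the file itself. As a hedged sketch (assuming MySQL Connector/J, and that the target table and columns are already set to utf8), these parameter/value pairs can be added on the Options tab of the PDI database connection:

Code:

# Options tab of the MySQL connection (assumption, not taken from the poster's transformation)
characterEncoding = UTF-8
useUnicode = true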

I can't create a Data Source using MS SQL Server 2008 in Pentaho BI Server 5.1

Hi,

I can't create a Data Source in Pentaho BI Server 5.1 using MS SQL Server 2008. I am using jTDS 1.3, and it is not possible to connect successfully with this Data Source.

Has anybody tried this before?

Please advise.

Can this work using ODBC?
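For reference, a hedged sketch of the jTDS JDBC URL shape (host, port and database name below are placeholders, not values from the post); whether the wizard or a generic database connection is used, the URL has the same form:

Code:

jdbc:jtds:sqlserver://your-host:1433/your_database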

Regards

Amazon EC2 with Weka

I am using Weka right now to do some data mining. However, I have over 10,000 instances and 2,000 attributes, and my computer cannot handle that much data (it takes 3 days to compute!). So I would like to know which EC2 instance type I should go with to compute this much data within 5-10 minutes.
Thank you
Additional information (on my laptop):
10,000 instances
2,000 attributes
Logistic regression model
10-fold cross-validation
SSD hard drive
Intel Core i7 CPU Q740 @ 1.73 GHz
8 GB RAM, 64-bit Weka

Update: I have created an r3.2xlarge, which has 60 GB of RAM and 8 vCPUs, and it is still slower than my laptop.
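One thing worth checking in that situation (a hedged aside, not a diagnosis): Weka only uses as much memory as the Java heap allows, regardless of how much RAM the machine has, so on a 60 GB instance the heap usually has to be raised explicitly when launching it, for example:

Code:

# Launch the Weka GUI with a larger Java heap; the 32g value is just an example.
java -Xmx32g -cp weka.jar weka.gui.GUIChooser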

Sap bobj 4.1 online training

Online Training SAP BUSINESS OBJECTS (BO) with BI

Batch Starting: Next Week Duration of Course: 30 Working Days
Duration of Class: 60 minutes a day.
The course includes training material with lab exercises and 24/7 system access to SAP BO and BI.
Special points to note in training:
1. Training will be on the latest 7.0 (NetWeaver) versions, including BI and BO integration.
2. I will help you in resume preparation and Interviews.
3. Training will be based on real-time projects that I've worked on before. Emphasis is given to important topics that are required and commonly used in real project implementations.
4. Training will cover both Functional and technical aspects of BO.

SAP BO is the new HOT Module in SAP. Make use of the training opportunity provided by experienced professional.
Here are the key elements of my training to be considered while comparing to other training places.
• 24/7 System Access for the entire month of training (additional access can be purchased for a minimal cost) as opposed to 5 days system access provided by SAP Academy
• Access to both Netweaver 7.0 and BO systems
• Experienced faculty as opposed to a SAP Academy trainer who doesn’t have real-time experience and is a full time trainer.
• Access to faculty even after the training is completed
• Cost is 1/5 of the amount spent for SAP Academy training
• Support in Resume and Interview preparation
• On-job support
• This is purely online and can attend from anywhere in the World.

Pre Requisites: SAP BI or BO or Exp. in any Data Warehouse Technologies.

Some of the SAP BO course topics covered by our professionals:
1. Overview – System Set up and Administration
2. SAP BI Launch pad and Webi Reporting
3. Installation & Configuration
4. SAP BW integration with BO
5. Server administration
6. SAP Lumira and Design Studio
7. PMP,UMT and Scheduling
8. Integrated solution architecture
9. OLAP Universe design approach
10. Connectivity Options
11. Variables & Prompting
12. Hierarchies
13. Publishing
14. IDT and UDT
15. Dashboard and Crystal Reports
16. HANA with BO
Many more subtopics are covered; for details, please go through the website.

Please call us for the demo classes; we have regular batches and weekend batches.
Contact Number: INDIA: +91 9972971235, USA :1 347 635 1422
Email: madhukar.dwbi@gmail.com
Web: http://www.onlinetrainingsapbo.com/S...se-Content.php

Installation problems: Pentaho 5.1 CE on JDK 1.8

Hi all,

I'm very new to Pentaho, and I need to install it on my machine to start a BI project. I installed the latest community version from the community website (biserver-ce-5.1.0.0-752), set PENTAHO_JAVA_HOME to point to my JDK 1.8 installation, and simply unzipped the file and ran it. I then got the following error:

Code:

2014-07-22 00:02:48,669 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2014-07-22 00:02:48,671 ERROR [org.pentaho.platform.util.logging.Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager: PluginManager.ERROR_0011 - Failed to register plugin cgg
org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from file [C:\Pentaho\biserver-ce\pentaho-solutions\system\cgg\plugin.spring.xml]; nested exception is java.lang.IllegalStateException: Context namespace element 'annotation-config' and its parser class [org.springframework.context.annotation.AnnotationConfigBeanDefinitionParser] are only available on JDK 1.5 and higher
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:420)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:342)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:310)
    at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.getNativeBeanFactory(DefaultPluginManager.java:411)
    at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.initializeBeanFactory(DefaultPluginManager.java:439)
    at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.reload(DefaultPluginManager.java:189)
    at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.startup(PluginAdapter.java:40)
    at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:398)
    at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:389)
    at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:368)
    at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:389)
    at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
    at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:326)
    at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:323)
    at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:368)
    at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:323)
    at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:294)
    at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:207)
    at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
    at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
    at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.IllegalStateException: Context namespace element 'annotation-config' and its parser class [org.springframework.context.annotation.AnnotationConfigBeanDefinitionParser] are only available on JDK 1.5 and higher
    at org.springframework.context.config.ContextNamespaceHandler$1.parse(ContextNamespaceHandler.java:65)
    at org.springframework.beans.factory.xml.NamespaceHandlerSupport.parse(NamespaceHandlerSupport.java:69)
    at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1297)
    at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1287)
    at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:135)
    at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:92)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:507)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:398)
    ... 42 more


I've done some research and found that I have to copy the contents of the directory biserver-ce/pentaho-solutions/system/cgg/lib into biserver-ce/tomcat/webapps/pentaho/WEB-INF/lib/. I did all of that, and after restarting Pentaho, I got the following error:


Code:

2014-07-21 23:34:13,281 ERROR [org.springframework.web.context.ContextLoader] Context initialization failedorg.springframework.beans.factory.BeanDefinitionStoreException: Parser configuration exception parsing XML from file [C:\Pentaho\biserver-ce\pentaho-solutions\system\pentaho-spring-beans.xml]; nested exception is javax.xml.parsers.ParserConfigurationException: Unable to validate using XSD: Your JAXP provider [org.apache.crimson.jaxp.DocumentBuilderFactoryImpl@6e171cd7] does not support XML Schema. Are you running on Java 1.4 with Apache Crimson? Upgrade to Apache Xerces (or Java 1.5) for full XSD support.
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:412)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:342)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:310)
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:143)
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:178)
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:149)
    at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:124)
    at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:92)
    at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:123)
    at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:422)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:352)
    at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:255)
    at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:199)
    at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:45)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
    at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
    at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: javax.xml.parsers.ParserConfigurationException: Unable to validate using XSD: Your JAXP provider [org.apache.crimson.jaxp.DocumentBuilderFactoryImpl@6e171cd7] does not support XML Schema. Are you running on Java 1.4 with Apache Crimson? Upgrade to Apache Xerces (or Java 1.5) for full XSD support.
    at org.springframework.beans.factory.xml.DefaultDocumentLoader.createDocumentBuilderFactory(DefaultDocumentLoader.java:102)
    at org.springframework.beans.factory.xml.DefaultDocumentLoader.loadDocument(DefaultDocumentLoader.java:70)
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:396)
    ... 37 more
Caused by: java.lang.IllegalArgumentException: No attributes are implemented
    at org.apache.crimson.jaxp.DocumentBuilderFactoryImpl.setAttribute(DocumentBuilderFactoryImpl.java:93)
    at org.springframework.beans.factory.xml.DefaultDocumentLoader.createDocumentBuilderFactory(DefaultDocumentLoader.java:99)
    ... 39 more

What am I doing wrong? Do I have to install other tools? Is there a clear installation manual?
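Both stack traces are consistent with the Spring/JAXP stack bundled in BI Server 5.1 CE not recognising Java 8. As a hedged sketch of one workaround (assuming a Java 7 JDK is installed; the path below is only an example), pointing PENTAHO_JAVA_HOME at Java 7 before starting the server is worth trying:

Code:

:: Windows command prompt, before running the startup script; the JDK path is an example.
set PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.7.0_67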
Thanks a lot,
Raffaele

Pentaho BI Server calling an external lib

- I'm using the BI Server with an external service library (written by me) through a scriptable data source.
- I put logging into my service.
- When I generate a report, I see that my service is called multiple times (more than 4 times).
So my question is: is this normal or not? If it is abnormal, how do I fix it? It is causing low performance.

Online SAP Workflow Training by vastly experienced professional trainers

SAP Workflow online training by 123Trainings is a great opportunity for students. SAP Workflow is one of the most in-demand data analysis modules on which to build a career. All our students were happy and able to find jobs quickly in the USA and Canada. 123Trainings is your one-stop solution to learn SAP Workflow online from the comfort of your home with flexible timings. Our SAP Workflow online trainers come with vast work experience and teaching skills, and our SAP Workflow online training is regarded as one of the best online trainings in the world. Learners can grasp the subject from our highly experienced and certified trainers, which helps students work on real-time projects. We are in this field with passion and dedication and have been providing online trainings for many years.

Some of the SAP Workflow online training course topics covered by our professionals:

  1. Initial Workflow Configurations
  2. Tasks & Workflow Templates
  3. Start Condition & Check Function Module
  4. Mail Tasks & Notification Program
  5. Configuring Webdynpro application for WF
  6. Workflow with Classes (Object Oriented)
  7. Workflow Administration
  8. Workflow Monitoring
  9. Workflow Troubleshooting
  10. Workflow Reports, Tables & FMs


Many more subtopics are covered; for details, please go through the website.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact Details: India: +91 97000-10111 USA: +1 209-207-3642

Email: 123onlineclasses@gmail.com

Web: http://123trainings.com/it-sap-workf...-training.html

Online Oracle 11g DBA Training in USA | Online Oracle 11g DBA training in UK

Oracle 11g DBA is the latest and most sophisticated Oracle database, and 123Trainings is the best Oracle 11g DBA online training provider. Oracle Database technology is used by the world's leading companies to manage critical business functions, systems, and processes. Through this continuing education program, you will build a solid foundation in Oracle Database 11g, the industry's most recognized and advanced database management system, and master global standards through the Oracle Database 11g Administrator Certification.


Oracle 11g DBA Course content:


  • Exploring the Oracle Database Architecture
  • Preparing the Database Environment
  • Creating an Oracle Database
  • Managing the Oracle Instance
  • Configuring the Oracle Network Environment
  • Managing Database Storage Structures
  • Administering User Security
  • Managing Undo Data
  • Implementing Oracle Database Security
  • Database Maintenance
  • Performance Management
  • Backup and Recovery Concepts
  • Performing Database Backups
  • Configuring for Recoverability
  • Using RMAN to Perform Recovery


Many more subtopics are covered; for details, please go through the website.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact number : India : +91 97000-10111
USA : +1 209-207-3642

Email : 123onlineclasses@gmail.com
Web : http://123trainings.com/it-oracle-11...-training.html