Channel: Pentaho Community Forums

How to solve the following exception

org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Failed at query: SELECT
`tbl_demo`.`id`,
`tbl_demo`.`name`
FROM
`tbl_demo`
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.queryData(SimpleSQLReportDataFactory.java:210)
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SQLReportDataFactory.queryData(SQLReportDataFactory.java:162)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:125)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:75)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:432)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryData(CachingDataFactory.java:326)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.queryReportData(DesignTimeDataSchemaModel.java:385)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.buildDataSchema(DesignTimeDataSchemaModel.java:315)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.ensureDataSchemaValid(DesignTimeDataSchemaModel.java:244)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.getDataSchema(DesignTimeDataSchemaModel.java:154)
at org.pentaho.reporting.designer.core.editor.fieldselector.FieldSelectorPanel.computeColumns(FieldSelectorPanel.java:138)
at org.pentaho.reporting.designer.core.editor.fieldselector.FieldSelectorPanel$ReportModelChangeHandler.nodeChanged(FieldSelectorPanel.java:68)
at org.pentaho.reporting.engine.classic.core.AbstractReportDefinition.fireModelLayoutChanged(AbstractReportDefinition.java:1247)
at org.pentaho.reporting.engine.classic.core.MasterReport.updateChangedFlagInternal(MasterReport.java:512)
at org.pentaho.reporting.engine.classic.core.Element.notifyNodeChildAdded(Element.java:982)
at org.pentaho.reporting.engine.classic.core.MasterReport.setDataFactory(MasterReport.java:401)
at org.pentaho.reporting.designer.core.actions.report.AddDataFactoryAction.addDataFactory(AddDataFactoryAction.java:135)
at org.pentaho.reporting.designer.core.actions.report.AddDataFactoryAction.actionPerformed(AddDataFactoryAction.java:89)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$Handler.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.AbstractButton.doClick(Unknown Source)
at javax.swing.plaf.basic.BasicMenuItemUI.doClick(Unknown Source)
at javax.swing.plaf.basic.BasicMenuItemUI$Handler.mouseReleased(Unknown Source)
at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
at java.awt.Component.processMouseEvent(Unknown Source)
at javax.swing.JComponent.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEventImpl(Unknown Source)
at java.awt.EventQueue.access$000(Unknown Source)
at java.awt.EventQueue$3.run(Unknown Source)
at java.awt.EventQueue$3.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(Unknown Source)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(Unknown Source)
at java.awt.EventQueue$4.run(Unknown Source)
at java.awt.EventQueue$4.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=1' at line 1
at sun.reflect.GeneratedConstructorAccessor95.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
at com.mysql.jdbc.Util.getInstance(Util.java:381)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1031)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:957)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3376)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3308)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1837)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1961)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2537)
at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1463)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1368)
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.parametrizeAndQuery(SimpleSQLReportDataFactory.java:374)
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.queryData(SimpleSQLReportDataFactory.java:206)
... 56 more
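The `Caused by` section points at the actual failure: the JDBC driver is sending the legacy `SET OPTION` syntax, which MySQL 5.6 and later no longer accept. Roughly:

```sql
-- What older Connector/J versions issue before a row-limited design-time query:
SET OPTION SQL_SELECT_LIMIT=1;   -- legacy syntax, removed in MySQL 5.6

-- The form current servers accept:
SET SQL_SELECT_LIMIT=1;
```

Swapping in a newer MySQL Connector/J jar (5.1.21 or later) usually resolves this, since recent drivers use the supported form.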

Table Input Step with SQL query not returning results after migration to Linux

Hi guys
I wonder if you could help me, please.
I have the latest version of Pentaho Data Integration (5.0.1) installed on Windows 7.

I have a Table Input step which runs a moderately complex MySQL query with a number of joins.
The step is within a transformation which takes these results and sends them on to a Text file output step.

Everything ran happily on my local machine.
MySQL was installed on this machine too

As soon as the database schema was migrated to a MySQL instance on a central Linux server, the Table Input step stopped bringing back any results.
The transformation runs, but you can see nothing but zeros everywhere in the step metrics.
The connection set up on the Table Input step is definitely against the new Linux instance.

When I press the preview button nothing returns.
I tried a simpler SQL query and results were returned from the Linux instance.

When I cut and paste this complex query into the SQL window of a MySQL Workbench 6.0 CE session against the schema on the Linux server, the query does return results as expected, so I know the query is correct. It takes about 114 seconds.

I am fairly new to Pentaho DI; it's an amazing tool, so I'm probably not doing things as efficiently as possible.
I have a number of transformations set up like this, running within a job producing reports from fairly complex SQL statements.

I was wondering if any of the following might be the problem:

Is this a memory-related issue, down to the fact that I'm running PDI locally on Windows but connecting to a remote Linux server?
Should I set this up differently?
Do I need to get my Linux administrator to allocate more memory to the MySQL instance?
Can I increase PDI's memory settings somehow so it can cope with bringing back results from complex SQL queries running on the remote Linux server?
Any help would be gratefully received. Thank you for your time.

Using a 'Get a file with SFTP' step and executing it in a job

I have a transformation that gets yesterday's date, converts it into a string of the form yyyy-MM-dd, and uses that to derive a fully qualified file name path with extension.
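For reference, the derivation described above can be sketched in plain JavaScript (assuming a Modified Java Script Value step is doing this work; the path below is a made-up example):

```javascript
// Build yesterday's date as a yyyy-MM-dd string.
function yesterdayString() {
  var d = new Date();
  d.setDate(d.getDate() - 1);                     // roll back one day; Date handles month/year wrap
  var m = ("0" + (d.getMonth() + 1)).slice(-2);   // zero-pad month (getMonth is 0-based)
  var day = ("0" + d.getDate()).slice(-2);        // zero-pad day of month
  return d.getFullYear() + "-" + m + "-" + day;
}

// Hypothetical fully qualified remote file name derived from the date:
var fileName = "/incoming/export_" + yesterdayString() + ".csv";
```

In PDI the resulting string would be exposed as a new field (or set as a variable) and handed to the SFTP job entry.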

I created two jobs: the first job gets yesterday's date and copies it into memory.

The second job takes the values from the first to get the file from the remote server using a 'Get a file with SFTP' step.

My job runs fine, but it does not download the file as expected.

What am I not doing right?

Please help me out.

I have attached both the sample transformation and the logs.

Thanks,

Ron
Attached Files

Dashboards Created in Ctools 5.0 are not opening in other instance of Ctools 5.0

Hi All,

I have created a dashboard in the latest CDE version, integrated with my Pentaho 4.5 BI Server. I installed CTools using the following

link: "http://pedroalves-bi.blogspot.in/201...lable-cdf.html".

It was working fine in the 4.5 BI Server. We had no issues with that.

We have since migrated from Pentaho 4.5 to 5.0 and installed CDE. I have uploaded the dashboard files into my new BI Server, but when I try to open these dashboards, they display a CDA tab showing query results instead of the dashboard. Can anyone please help in finding the root cause of this?

Thanks,
Santosh Bhagavatula

How to Access MySQL database with WEKA on OS X?

Hello,

I can't get WEKA 3.6.10 on OS X to access the MySQL database. Here's what I've done:
  1. Downloaded mysql-connector-java-5.1.30-bin.jar and put it into /Library/Java/Extensions
  2. Extracted the weka.jar. Edited weka/experiment/DatabaseUtils.props.mysql, weka/experiment/DatabaseUtils.props, weka/weka/experiment/DatabaseUtils.props.mysql and replaced the old JDBC driver class with com.mysql.jdbc.Driver. Put these files back into weka.jar with $ jar uf weka.jar <file>
  3. Made sure the MySQL server is running by accessing it from the command line, and that it is listening on 3306 with telnet
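For reference, the relevant keys WEKA reads from DatabaseUtils.props look roughly like this (a sketch; the URL matches the one used later in this post and is only an example):

```properties
# weka/experiment/DatabaseUtils.props (sketch: only the two relevant keys)
jdbcDriver=com.mysql.jdbc.Driver
jdbcURL=jdbc:mysql://localhost:3306/analytics
```

If jdbcURL ends up empty at connect time, JDBC's DriverManager reports exactly a "No suitable driver found for" message with nothing after "for".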


When starting the WEKA explorer, I no longer get error messages about a missing MySQL driver:
$ cat ~/weka.log
2014-04-09 09:43:42 weka.gui.GUIChooser main
INFO: Logging started
---Registering Weka Editors---
Trying to add database driver (JDBC): RmiJdbc.RJDriver - Error, not in CLASSPATH?
Trying to add database driver (JDBC): jdbc.idbDriver - Error, not in CLASSPATH?
Trying to add database driver (JDBC): com.mckoi.JDBCDriver - Error, not in CLASSPATH?
Trying to add database driver (JDBC): org.hsqldb.jdbcDriver - Error, not in CLASSPATH?



So in the WEKA Explorer, I click on Open DB, put in my JDBC URL and username/password. When I click on Connect, it says "connecting to: jdbc:mysql://localhost:3306/analytics = true" but nothing more in the Info section.

Then when I try to close the SQL-Viewer with OK, it says "Problem connecting to database: No suitable driver found for". There's nothing else.

There's also nothing else in the weka.log

Do I need another JDBC driver even if I just want to use WEKA with MySQL?

404 error occurring while trying to connect MySQL to Pentaho 5 CE

Good Day,

I am seeing a 404 error while connecting Pentaho 5 CE to MySQL.

Can anybody please help me resolve this issue?

Thanks-

Chandan Aggarwal

Attribute Level at the schema xml

I am trying to add dimension attributes in the Schema Workbench designer (Community Edition), but the tool doesn't let you add attributes, only hierarchies. I added the attributes manually, as in the sample below:

<Dimension name='Customer'>
<Attributes>
<Attribute name='Country' ... />
<Attribute name='State' .../>
<Attribute name='City' .../>
</Attributes>
<Hierarchies>
<Hierarchy name='Customers'>
<Level attribute='Country'/>
<Level attribute='State'/>
<Level attribute='City'/>
</Hierarchy>
</Hierarchies>
</Dimension>

Schema Workbench allows you to read the file, but it completely eliminates everything under the <Dimension> element.
Question: the Attribute element is widely documented in all the Mondrian books and documentation, yet it doesn't seem to be recognized by the "Community" Schema Workbench tool. Is this something that only exists in the "Enterprise" version? If so, after editing the XML and manually adding the attributes, could it be published, and how?
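For comparison, the <Attributes> element belongs to the Mondrian 4 schema format, while Schema Workbench reads Mondrian 3.x schemas. In the 3.x format the same dimension would be written with levels directly inside the hierarchy, something like this (table, key, and column names are assumptions):

```xml
<Dimension name="Customer">
  <Hierarchy name="Customers" hasAll="true" primaryKey="customer_id">
    <Table name="customer"/>
    <Level name="Country" column="country" uniqueMembers="true"/>
    <Level name="State" column="state"/>
    <Level name="City" column="city"/>
  </Hierarchy>
</Dimension>
```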

Thanks!


Rey Nunez

How do I change the "c:\.pentaho" and "c:\.tonbeller" folder?

Hello.

When I start Pentaho, it automatically creates the folders ".pentaho", ".tonbeller" and ".kettle" in the root of C:\.
(That is, it creates "c:\.pentaho", "c:\.tonbeller" and "c:\.kettle".) [Windows version]

I would like to change these folders to "c:\pentaho\.pentaho", "c:\pentaho\.tonbeller" and "c:\pentaho\.kettle".

Using this manual, I fixed the ".kettle" folder location, but it does not work for the ".pentaho" and ".tonbeller" folders.
http://infocenter.pentaho.com/help/i..._home_dir.html

Is it possible to change the "c:\.pentaho" and "c:\.tonbeller" folders?
What do I need to do to fix it?

Thanks.

Multiple Inputs in a single transformation

Is it possible to combine two different input streams? I have a string coming out of a JavaScript step and a column value from a Table Input step. These are completely different values without any prior connection. I need to combine them as Source1|Source2 to use in downstream steps. I tried Concat and Join steps, but was not able to achieve what I need.

Drawing striplines on CCC Line Chart Components

Hi,

I have a line chart and I'm trying to add a vertical dotted line (a stripline) similar to the red line in this chart:

stripline.jpg

Can anyone tell me how to do this?

Thanks!
Attached Images

Cluster of stacked bars in CCC Bar Chart?

Hi,

I'm trying to make a bar chart showing a cluster of stacked bars, similar to the following (except, I only need to cluster two stacked bars in each group, not four):

clustered_stacked.jpg

I can make a stacked bar chart using a CCC Bar Chart component, but can't figure out how to add another in a cluster. I'm running this SQL query:

Code:

select
        date_produced
        , product
        , perc_of_weekly_demand
        , perc_of_weekly_sales
    from mytable
    order by date_produced asc;

which produces a table like this:

date_produced product perc_of_weekly_demand perc_of_weekly_sales
3/14/14 Hat .8 .85
3/14/14 Glove .2 .15
3/21/14 Shirt .7 .6
3/21/14 Shoe .3 .4

I'd like to plot a chart where the series is 'product', the category is 'date_produced', and there are two value columns: 'perc_of_weekly_demand', and 'perc_of_weekly_sales'. This way, I'd get a cluster of stacked bars where the x-axis has ticks for the date, and within each date, there is a cluster of two stacked bars (one for perc_of_weekly_demand and one for perc_of_weekly_sales).

So far, I'm able to make a CCC Bar Chart, but I'm able to show only one stacked bar per date, not a cluster of two. I've tried playing around with the plot2 parameters in the CCC Bar Chart, but have not been able to figure out how to add a second stacked bar.

Thanks for the help!
Attached Images

Disabling cache for Pentaho

Hi

I have a report for which the data source is a Mondrian cube. (biserver-ce version 4.8 - Report Designer 3.9.1 - Mondrian 3.5)


In the Pentaho User Console, I have noticed that whenever the underlying data for a report changes, I do not see the changes when I rerun the report.


I have tried the following:


1) If I go to "Tools --> Refresh --> Reporting Data Cache" and rerun the report I do not see the changes.
2) If I go to "Tools --> Refresh --> Mondrian Schema Cache" and rerun the report I see the changes.
3) In the report designer I have set "Master Report --> Attributes --> pentaho --> report-cache" value to false - still no refresh of the data.
4) In the report designer I additionally set "Master Report --> Attributes --> query --> data-cache" value to false. - still no refresh of data.
5) In the "\pentaho-solutions\system\mondrian\mondrian.properties" file, I set "mondrian.rolap.star.disableCaching=true" - still no changes reflected.
6) In "\tomcat\tomcat\webapps\pentaho\WEB-INF\classes\classic-engine.properties" file I added the line org.pentaho.reporting.engine.classic.core.cache.DataCache=


The thing that finally works for me is to do both steps 5 and 6. Note that step 5 or 6 on its own does not give me the desired result.
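For reference, after steps 5 and 6 the two files contain the following (paths as in the post, shown here with forward slashes):

```properties
# pentaho-solutions/system/mondrian/mondrian.properties  (step 5)
mondrian.rolap.star.disableCaching=true

# tomcat/webapps/pentaho/WEB-INF/classes/classic-engine.properties  (step 6)
# leaving the value empty unregisters the reporting engine's data cache
org.pentaho.reporting.engine.classic.core.cache.DataCache=
```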

Is this the right way to do things? I have no idea what side effects these changes will have.


Is there a better way of disabling the cache on a per-report basis - or of just disabling the caching of data (without disabling the cache for things other than data, like metadata, Mondrian schemas, etc.)?


Thanks in advance
Vadi

Display parameter value as per user login

Hi,

I created a report in Pentaho Report Designer.

The report has some parameters.

Now I want to give users access to this report in such a way that some users see all values in the dropdown list while others see only a provided subset.

How can I achieve this?

Can we make parameters visible or invisible based on a condition?

Please help. Thanks in advance!

Problems with extracting data with jsoup

I'm back with another problem... after I saw the result of JTidy, I tried to extract data with jsoup, and I'm using this code to extract it:

Code:

var Jsoup = org.jsoup.Jsoup;
var Whitelist = org.jsoup.safety.Whitelist;

var doc = Jsoup.parse(html);

// cleaning with Whitelist.relaxed() stripped the content, so it is commented out:
//doc.body().html(Jsoup.clean(doc.body().html(), Whitelist.relaxed()));

doc.head().remove();

// note: "resultDataRow1" is a tag-name selector; selecting by id would need "#resultDataRow1"
var xhtml = doc.select("resultDataRow1").outerHtml();

I had to comment out the Jsoup.clean() line or I got back a blank page. Now, the data I have to extract are these:

Quote:

<div id="resultDataRow1" class="docMain">
<div class="dataCol1">
<span class="custom-checkbox">
<input name="selectedEIDs" value="2-s2.0-84884994817" onclick="return selectDeselectResult(document.SearchResultsForm, this);" id="eid_2-s2.0-84884994817" type="checkbox">
<span class="box"><span class="tick"></span></span>
</span>
<br>
<label for="eid_2-s2.0-84884994817">
<span class="hidden-label">
result
2</span>
</label>
</div>
<div class="dataCol2">
<label class="hidden-label">Document</label>
<span class="docTitle">
<a href="http://www.scopus.com/record/display.url?eid=2-s2.0-84884994817&amp;origin=resultslist&amp;sort=plf-f&amp;src=s&amp;st1=flesca&amp;sid=CB6B7BF29B4C406FC28B0A56BC998655.WXhD7YyTQ6A7Pvk9AlA%3a20&amp;sot=b&amp;sdt=b&amp;sl=19&amp;s=AUTHOR-NAME%28flesca%29&amp;relpos=1&amp;relpos=1&amp;citeCnt=0&amp;searchTerm=AUTHOR-NAME%28flesca%29" title="Show document details" onclick="javascript:submitRecord('2-s2.0-84884994817','1','0');">Efficiently estimating the probability of extensions in abstract argumentation</a>
</span>
</div>
<div class="dataCol3">
<label class="hidden-label">Authors of Document</label>
<span class="">
<a href="http://www.scopus.com/authid/detail.url?origin=resultslist&amp;authorId=15131404100&amp;zone=" title="Show author details">Fazzinga, B.</a>, <a href="http://www.scopus.com/authid/detail.url?origin=resultslist&amp;authorId=55926108000&amp;zone=" title="Show author details">Flesca, S.</a>, <a href="http://www.scopus.com/authid/detail.url?origin=resultslist&amp;authorId=36020331900&amp;zone=" title="Show author details">Parisi, F.</a>
</span>
</div>
<div class="dataCol4">
<label class="hidden-label">Year the Document was Publish</label>
<span class="">
2013
</span>
</div>
<div class="dataCol5">
<label class="hidden-label">Source of the Document</label>
<span class="">
<a href="http://www.scopus.com/source/sourceInfo.url?sourceId=25674&amp;origin=resultslist" title="Show source title details">Lecture
Notes in Computer Science (including subseries Lecture Notes in
Artificial Intelligence and Lecture Notes in Bioinformatics)</a>
</span>
</div>
<div class="dataCol6">
<label class="hidden-label">Number of Documents that reference this Document</label>
0
<br>
<span class="showCitedBy visibleHidden">Cited <br> by</span>
</div>
But I get just this as output:
Quote:

"<div id=""resultDataRow0"" class=""docMain""></div>"
"<div id=""resultDataRow1"" class=""docMain""></div>"
.
.
"<div id=""resultDataRow10"" class=""docMain"">
<div class=""dataCol1""></div>
</div>"

Any suggestions? :(

after clustering

hi
After I finish clustering the data, I want to build a model for each cluster. Does anyone have any ideas?

thanks

ZooKeeper available but no active master location found

Hi

I downloaded DI and started:
./spoon.sh
I got a nice GUI. I am trying to configure an HBase connection to the Hortonworks Sandbox 2.0.
Screen Shot 2014-04-10 at 13.20.53.jpg

When I press Get table names, I get this in the log:
...
INFO 10-04 12:43:47,671 - Initiating client connection, connectString=sandbox.hortonworks.com:2181 sessionTimeout=180000 watcher=hconnection
INFO 10-04 12:43:47,672 - Opening socket connection to server sandbox.hortonworks.com/0:0:0:0:0:0:0:1:2181
WARN 10-04 12:43:47,672 - Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:692)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1119)
INFO 10-04 12:43:48,467 - Opening socket connection to server sandbox.hortonworks.com/127.0.0.1:2181
INFO 10-04 12:43:48,467 - Socket connection established to sandbox.hortonworks.com/127.0.0.1:2181, initiating session
INFO 10-04 12:43:48,478 - Session establishment complete on server sandbox.hortonworks.com/127.0.0.1:2181, sessionid = 0x1453145e950005e, negotiated timeout = 40000
INFO 10-04 12:43:48,486 - ZooKeeper available but no active master location found
INFO 10-04 12:43:48,486 - getMaster attempt 0 of 10 failed; retrying after sleep of 1000
org.apache.hadoop.hbase.MasterNotRunningException
...

I tried with my own custom Java code from the same machine:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.util.Bytes;

public class Hbase_connect {

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "sandbox.hortonworks.com");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        conf.set("hbase.rootdir", "hdfs://sandbox.hortonworks.com:8020/apps/hbase/data");
        conf.set("zookeeper.znode.parent", "/hbase-unsecure");
        HBaseAdmin admin = new HBaseAdmin(conf);
        HTableDescriptor[] tabdesc = admin.listTables();
        for (int i = 0; i < tabdesc.length; i++) {
            System.out.println("Table = " + new String(tabdesc[i].getName()));
        }
    }
}
margusja@IRack:~/java/hbase_connect$ ls -lah libs/
total 160264
drwxr-xr-x 6 margusja staff 204B Apr 5 22:09 .
drwxr-xr-x 6 margusja staff 204B Apr 10 13:23 ..
-rw-r--r-- 1 margusja staff 2.5K Oct 7 2013 hadoop-client-2.2.0.jar
-rw-r--r-- 1 margusja staff 2.6M Apr 5 22:08 hadoop-common-2.2.0.2.0.6.0-76.jar
drwxr-xr-x 4 margusja staff 136B Apr 5 19:40 hbase-0.96.2-hadoop2
-rw-r--r-- 1 margusja staff 76M Apr 4 02:18 hbase-0.96.2-hadoop2-bin.tar.gz

margusja@IRack:~/java/hbase_connect$ java -cp ./:./libs/*:./libs/hbase-0.96.2-hadoop2/lib/* Hbase_connect
2014-04-10 13:31:56.917 java[5611:1703] Unable to load realm info from SCDynamicStore
2014-04-10 13:31:56,959 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-04-10 13:31:57,112 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
2014-04-10 13:31:57,112 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:host.name=10.10.5.8
2014-04-10 13:31:57,113 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.version=1.7.0_09
2014-04-10 13:31:57,113 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.vendor=Oracle Corporation
2014-04-10 13:31:57,113 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.home=/Library/Java/JavaVirtualMachines/jdk1.7.0_09.jdk/Contents/Home/jre
2014-04-10 13:31:57,113 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.class.path=./:./libs/hadoop-client-2.2.0.jar:./libs/hadoop-common-2.2.0.2.0.6.0-76.jar:./libs/hbase-0.96.2-hadoop2/lib/activation-1.1.jar:./libs/hbase-0.96.2-hadoop2/lib/aopalliance-1.0.jar:./libs/hbase-0.96.2-hadoop2/lib/asm-3.1.jar:./libs/hbase-0.96.2-hadoop2/lib/avro-1.7.4.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-beanutils-1.7.0.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-beanutils-core-1.8.0.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-cli-1.2.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-codec-1.7.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-collections-3.2.1.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-compress-1.4.1.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-configuration-1.6.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-daemon-1.0.13.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-digester-1.8.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-el-1.0.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-httpclient-3.1.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-io-2.4.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-lang-2.6.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-logging-1.1.1.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-math-2.1.jar:./libs/hbase-0.96.2-hadoop2/lib/commons-net-3.1.jar:./libs/hbase-0.96.2-hadoop2/lib/findbugs-annotations-1.3.9-1.jar:./libs/hbase-0.96.2-hadoop2/lib/gmbal-api-only-3.0.0-b023.jar:./libs/hbase-0.96.2-hadoop2/lib/grizzly-framework-2.1.2.jar:./libs/hbase-0.96.2-hadoop2/lib/grizzly-http-2.1.2.jar:./libs/hbase-0.96.2-hadoop2/lib/grizzly-http-server-2.1.2.jar:./libs/hbase-0.96.2-hadoop2/lib/grizzly-http-servlet-2.1.2.jar:./libs/hbase-0.96.2-hadoop2/lib/grizzly-rcm-2.1.2.jar:./libs/hbase-0.96.2-hadoop2/lib/guava-12.0.1.jar:./libs/hbase-0.96.2-hadoop2/lib/guice-3.0.jar:./libs/hbase-0.96.2-hadoop2/lib/guice-servlet-3.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-annotations-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-auth-2.2.0.jar:./libs/hbase-0.
96.2-hadoop2/lib/hadoop-client-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-common-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-hdfs-2.2.0-tests.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-hdfs-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-mapreduce-client-app-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-mapreduce-client-common-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-mapreduce-client-core-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-mapreduce-client-jobclient-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-mapreduce-client-shuffle-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-yarn-api-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-yarn-client-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-yarn-common-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-yarn-server-common-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hadoop-yarn-server-nodemanager-2.2.0.jar:./libs/hbase-0.96.2-hadoop2/lib/hamcrest-core-1.3.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-client-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-common-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-examples-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-hadoop-compat-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-hadoop2-compat-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-it-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-prefix-tree-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-protocol-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-server-0.96.2-hadoop2-tests.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-server-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-shell-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-testing-util-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/hbase-thrift-0.96.2-hadoop2.jar:./libs/hbase-0.96.2-hadoop2/lib/htrace-core-2.04.jar:./libs/hbase-0.96.2-hadoop2/lib/httpclient-4.1.3.jar
:./libs/hbase-0.96.2-hadoop2/lib/httpcore-4.1.3.jar:./libs/hbase-0.96.2-hadoop2/lib/jackson-core-asl-1.8.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jackson-jaxrs-1.8.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jackson-mapper-asl-1.8.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jackson-xc-1.8.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jamon-runtime-2.3.1.jar:./libs/hbase-0.96.2-hadoop2/lib/jasper-compiler-5.5.23.jar:./libs/hbase-0.96.2-hadoop2/lib/jasper-runtime-5.5.23.jar:./libs/hbase-0.96.2-hadoop2/lib/javax.inject-1.jar:./libs/hbase-0.96.2-hadoop2/lib/javax.servlet-3.1.jar:./libs/hbase-0.96.2-hadoop2/lib/javax.servlet-api-3.0.1.jar:./libs/hbase-0.96.2-hadoop2/lib/jaxb-api-2.2.2.jar:./libs/hbase-0.96.2-hadoop2/lib/jaxb-impl-2.2.3-1.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-client-1.9.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-core-1.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-grizzly2-1.9.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-guice-1.9.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-json-1.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-server-1.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-test-framework-core-1.9.jar:./libs/hbase-0.96.2-hadoop2/lib/jersey-test-framework-grizzly2-1.9.jar:./libs/hbase-0.96.2-hadoop2/lib/jets3t-0.6.1.jar:./libs/hbase-0.96.2-hadoop2/lib/jettison-1.3.1.jar:./libs/hbase-0.96.2-hadoop2/lib/jetty-6.1.26.jar:./libs/hbase-0.96.2-hadoop2/lib/jetty-sslengine-6.1.26.jar:./libs/hbase-0.96.2-hadoop2/lib/jetty-util-6.1.26.jar:./libs/hbase-0.96.2-hadoop2/lib/jruby-complete-1.6.8.jar:./libs/hbase-0.96.2-hadoop2/lib/jsch-0.1.42.jar:./libs/hbase-0.96.2-hadoop2/lib/jsp-2.1-6.1.14.jar:./libs/hbase-0.96.2-hadoop2/lib/jsp-api-2.1-6.1.14.jar:./libs/hbase-0.96.2-hadoop2/lib/jsr305-1.3.9.jar:./libs/hbase-0.96.2-hadoop2/lib/junit-4.11.jar:./libs/hbase-0.96.2-hadoop2/lib/libthrift-0.9.0.jar:./libs/hbase-0.96.2-hadoop2/lib/log4j-1.2.17.jar:./libs/hbase-0.96.2-hadoop2/lib/management-api-3.0.0-b012.jar:./libs/hbase-0.96.2-hadoop2/lib/metrics-core-2.1.2.jar:./libs/hbase-0.96.2-hadoop2/li
b/netty-3.6.6.Final.jar:./libs/hbase-0.96.2-hadoop2/lib/paranamer-2.3.jar:./libs/hbase-0.96.2-hadoop2/lib/protobuf-java-2.5.0.jar:./libs/hbase-0.96.2-hadoop2/lib/servlet-api-2.5-6.1.14.jar:./libs/hbase-0.96.2-hadoop2/lib/slf4j-api-1.6.4.jar:./libs/hbase-0.96.2-hadoop2/lib/slf4j-log4j12-1.6.4.jar:./libs/hbase-0.96.2-hadoop2/lib/snappy-java-1.0.4.1.jar:./libs/hbase-0.96.2-hadoop2/lib/xmlenc-0.52.jar:./libs/hbase-0.96.2-hadoop2/lib/xz-1.0.jar:./libs/hbase-0.96.2-hadoop2/lib/zookeeper-3.4.5.jar
2014-04-10 13:31:57,114 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.library.path=/Users/margusja/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
2014-04-10 13:31:57,115 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.io.tmpdir=/var/folders/vm/5pggdh2x3_s_l6z55brtql3h0000gn/T/
2014-04-10 13:31:57,115 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.compiler=<NA>
2014-04-10 13:31:57,115 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:os.name=Mac OS X
2014-04-10 13:31:57,115 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:os.arch=x86_64
2014-04-10 13:31:57,115 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:os.version=10.9.2
2014-04-10 13:31:57,116 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.name=margusja
2014-04-10 13:31:57,116 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.home=/Users/margusja
2014-04-10 13:31:57,116 INFO [main] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.dir=/Users/margusja/java/hbase_connect
2014-04-10 13:31:57,119 INFO [main] zookeeper.ZooKeeper (ZooKeeper.java:<init>(438)) - Initiating client connection, connectString=sandbox.hortonworks.com:2181 sessionTimeout=30000 watcher=hconnection-0x586fb16d, quorum=sandbox.hortonworks.com:2181, baseZNode=/hbase-unsecure
2014-04-10 13:31:57,145 INFO [main] zookeeper.RecoverableZooKeeper (RecoverableZooKeeper.java:<init>(120)) - Process identifier=hconnection-0x586fb16d connecting to ZooKeeper ensemble=sandbox.hortonworks.com:2181
2014-04-10 13:32:02,222 INFO [main-SendThread(fe80:0:0:0:0:0:0:1%1:2181)] zookeeper.ClientCnxn (ClientCnxn.java:logStartConnect(966)) - Opening socket connection to server fe80:0:0:0:0:0:0:1%1/fe80:0:0:0:0:0:0:1%1:2181. Will not attempt to authenticate using SASL (unknown error)
2014-04-10 13:32:02,229 WARN [main-SendThread(fe80:0:0:0:0:0:0:1%1:2181)] zookeeper.ClientCnxn (ClientCnxn.java:run(1089)) - Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:692)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:350)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1068)
2014-04-10 13:32:02,338 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn (ClientCnxn.java:logStartConnect(966)) - Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
2014-04-10 13:32:02,339 WARN [main-SendThread(localhost:2181)] zookeeper.ClientCnxn (ClientCnxn.java:run(1089)) - Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:692)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:350)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1068)
2014-04-10 13:32:02,346 WARN [main] zookeeper.RecoverableZooKeeper (RecoverableZooKeeper.java:retryOrThrow(253)) - Possibly transient ZooKeeper, quorum=sandbox.hortonworks.com:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase-unsecure/hbaseid
2014-04-10 13:32:02,346 INFO [main] util.RetryCounter (RetryCounter.java:sleepUntilNextRetry(155)) - Sleeping 1000ms before retry #0...
2014-04-10 13:32:02,441 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn (ClientCnxn.java:logStartConnect(966)) - Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2014-04-10 13:32:02,442 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn (ClientCnxn.java:primeConnection(849)) - Socket connection established to localhost/127.0.0.1:2181, initiating session
2014-04-10 13:32:02,457 INFO [main-SendThread(localhost:2181)] zookeeper.ClientCnxn (ClientCnxn.java:onConnected(1207)) - Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x1453145e950006e, negotiated timeout = 30000
2014-04-10 13:32:04,340 INFO [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - hadoop.native.lib is deprecated. Instead, use io.native.lib.available
Table = ambarismoketest
Table = mytable
Table = simple_hcat_load_table
Table = users
Table = weblogs

Any hint?

Best regards, Margusja
Modified Java Script Value: error counting occurrences of "+" in a string

Hi all,
I'm trying to find the occurrences of the "+" sign in a string. I used the Modified Java Script Value step, calling the getOcuranceString(string, searchFor) function in my script, but I get this error:

The function call getOcuranceString is not valid (script#7)

If I change the string to search for (e.g. the "-" sign), the function works perfectly. I tried this in both Kettle 4.4.0 and the newest 5.0.1A, without success. I think this is a bug.

I've attached a sample of the job I'm trying to run, so you can quickly check whether I missed something.

Thanks in advance for the help.
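A hedged guess at the cause: if getOcuranceString treats its search argument as a regular expression, then a bare "+" is an invalid pattern (it is the regex "one or more" quantifier), while "-" is a plain character and works. A minimal workaround sketch in plain JavaScript for the Modified Java Script Value step (the helper name countOccurrences is mine, not a Kettle built-in):

```javascript
// Count literal occurrences of a substring, escaping regex
// metacharacters such as "+" so the search pattern stays literal.
function countOccurrences(str, searchFor) {
  // Escape every regex metacharacter in the search string
  var escaped = searchFor.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  var matches = str.match(new RegExp(escaped, 'g'));
  return matches === null ? 0 : matches.length;
}
```

An even simpler alternative that avoids regular expressions entirely is str.split(searchFor).length - 1.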
How to print page sub-totals in a group?

I'm sure it must be easy, but I cannot find any way to print the group's running sub-total on every page (except the last) of a group spanning several pages.

I could use the page footer... if I had any way to know which page is the last one of the group. Is there any way to know the current page number and the total page count within a group? That would also help to print a page-of-pages value for the group; I only found these values for the report as a whole.

Thanks

How to create a "bar menu" in a chart on a right-click event?

Hi!

I have a chart, and on this chart I would like to create a menu containing two different links to two different reports.

Do you have any idea how to do this?
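One possible approach, sketched under assumptions: intercept the browser's contextmenu event on the chart's container element and render a small menu of report links yourself. The element id ("myChartDiv") and the report URLs below are placeholders, not names from the question:

```javascript
// Hypothetical report links to show in the custom right-click menu.
var reportLinks = [
  { label: 'Report A', url: '/pentaho/report?name=reportA' },
  { label: 'Report B', url: '/pentaho/report?name=reportB' }
];

// Pure helper: turn link descriptors into anchor-tag HTML strings.
function buildMenuItems(links) {
  return links.map(function (l) {
    return '<a href="' + l.url + '">' + l.label + '</a>';
  });
}

// Browser-only wiring (skipped when no DOM is available):
if (typeof document !== 'undefined') {
  document.getElementById('myChartDiv')
    .addEventListener('contextmenu', function (e) {
      e.preventDefault(); // suppress the browser's default menu
      var menu = document.createElement('div');
      menu.style.position = 'absolute';
      menu.style.left = e.pageX + 'px';
      menu.style.top = e.pageY + 'px';
      menu.innerHTML = buildMenuItems(reportLinks).join('<br>');
      document.body.appendChild(menu);
    });
}
```

Depending on the charting library in use, a built-in click callback on the chart component may be a cleaner hook than a raw DOM listener; the sketch above only shows the generic browser-side mechanism.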

Not all files are installed when installing CTools for Pentaho 5.0

Hi All,

I used the following guide to install CTools in Pentaho 5.0:
http://pedroalves-bi.blogspot.in/201...lable-cdf.html

First I installed CTools in my local instance of Pentaho 5.0, then in my production instance. When I compared the two, a lot of files were missing from the production instance. I'm attaching a snapshot that shows some of the missing files.

filecompare.jpg


Can someone please help me figure out how to make it work, as it is not opening any dashboards at all.

Thanks,
Santosh Bhagavatula