Channel: Pentaho Community Forums

The NFL and Nike continue their "Color Rush" uniforms in 2016

The Pittsburgh Steelers are one of the proudest franchises in the NFL. They pride themselves on tradition and an unwillingness to join the crowd when it comes to simple changes to the uniform, or things of that nature. The Steelers don't have cheerleaders, have booed the team-appointed mascot to the point where he doesn't show his face inside Heinz Field on game days, and the fan base still has a fit when talking about the change from the block numbers to the italicized digits currently used on the team's uniforms.


A select group of teams falls into this category of 'classic' uniforms. The Green Bay Packers, Oakland Raiders, Kansas City Chiefs and Indianapolis Colts also come to mind when thinking about 'classic' uniforms, but this could all change in 2016 -- at least for one game.


Detroit Lions Team President Rod Wood recently spilled the beans that every team playing a Thursday Night Football game will be wearing the new NFL and Nike "Color Rush" uniforms. So, who plays on Thursday nights in 2016? Well, the entire NFL. Every team gets at least one Thursday Night Football game in a season, which means even those 'classic' teams will be sporting some new duds in 2016 -- if this turns out to be true.


Here is what the uniforms looked like last year:


Lions president Rod Wood says all teams playing on Thursday will wear a color rush uniform.


Some, like the Dallas Cowboys' and Rams', weren't bad, while others, like the Jacksonville Jaguars', were absolutely atrocious.


What would the Steelers' color rush uniforms look like if they indeed have to wear them in 2016? Our friends at The Steelers Wire, a USA Today website, came up with the following mock-ups for the team.



Not horrible, but certainly not what fans are accustomed to. Some will welcome the change, while others will complain and tell young children to get off their lawn. Either way, if Wood's comments are true, every NFL team will wear some new uniforms in 2016.

Sort rows step doesn't work in real time

Hi everyone,

I am using the Sort rows step to sort my data by some fields.
The Sort rows step gets its data from the previous step (in my case a Filter rows step).
My problem is that the Sort rows step doesn't work in real time.
That is, the Sort rows step only sorts the data after the previous step has finished.
I want the Sort rows step to sort in real time for my real-time dashboard application.
Please help me solve this problem.

Many thanks,
Hoang Anh.
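For context: Sort rows cannot emit anything until its input ends, because the last row to arrive might sort first, so a complete sort is inherently blocking. For a real-time dashboard, the usual compromise is to sort only a bounded window of recent rows. A rough sketch of that idea in plain Java (this is not the Kettle step API; the window size and integer rows are made up for illustration):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Sketch: a bounded "sorted window" that can emit a sorted view at any time,
// unlike a full sort which must wait for the end of the stream.
public class SlidingSortedWindow {
    static final int WINDOW_SIZE = 3;           // illustrative bound
    private final Deque<Integer> arrival = new ArrayDeque<>(); // arrival order

    // Add a row; evict the oldest once the window is full.
    public void add(int row) {
        arrival.addLast(row);
        if (arrival.size() > WINDOW_SIZE) {
            arrival.removeFirst();
        }
    }

    // Emit the current window, sorted, without waiting for end of stream.
    public List<Integer> sortedView() {
        List<Integer> view = new ArrayList<>(arrival);
        view.sort(null); // natural ordering
        return view;
    }

    public static void main(String[] args) {
        SlidingSortedWindow w = new SlidingSortedWindow();
        for (int row : new int[] {5, 1, 4, 2}) {
            w.add(row);
        }
        // Window now holds the last three arrivals {1, 4, 2}.
        System.out.println(w.sortedView()); // [1, 2, 4]
    }
}
```

The trade-off is that the view is only correct within the window; anything evicted is gone, which is usually acceptable for a "latest values" dashboard but not for a global sort.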

How to use "User Defined Java Class" to call a web service

I have searched a lot, but still haven't found any related information.
I need to call a web service from UDJC with my own code, because the web service requires a self-defined class as a parameter.
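For reference, a UDJC step body is ordinary Java, so a web-service call can be made with nothing but the JDK. The sketch below is an illustration only: the envelope, element names, and endpoint are invented placeholders, not a real service contract.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of a JDK-only web-service call of the kind one might embed in a
// UDJC step. The SOAP element names and endpoint are made-up placeholders.
public class SoapCallSketch {

    // Build a minimal SOAP 1.1 envelope around one string parameter.
    static String buildEnvelope(String customerId) {
        return "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             + "<soap:Body><getCustomer><id>" + customerId + "</id></getCustomer></soap:Body>"
             + "</soap:Envelope>";
    }

    // POST the envelope; in a real UDJC step this would be called from
    // processRow() and the response copied into an output field.
    static int post(String endpoint, String envelope) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) {
        // Only build the payload here; post() needs a live endpoint.
        System.out.println(buildEnvelope("42"));
    }
}
```

If the service needs a genuinely self-defined parameter class, the class can be defined inside the UDJC code itself or shipped as a jar in Kettle's lib directory so the step can load it.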

How to get the top N rows of data

Hi everyone,

After sorting my data, I want to get the top N rows.
Which step do I need to use to do this?
Sorry for my bad English.

Many thanks,
Hoang Anh.
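In PDI this is typically a Sort rows step followed by a step that passes only the first N rows (e.g. Sample rows). The underlying idea can be sketched in plain Java with a bounded min-heap, which keeps the N largest values without sorting the whole stream:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.PriorityQueue;

// Sketch of the top-N idea: a min-heap of size N means only the N largest
// values survive; everything smaller is evicted as the stream goes by.
public class TopN {

    static List<Integer> topN(int[] values, int n) {
        PriorityQueue<Integer> heap = new PriorityQueue<>(); // min-heap
        for (int v : values) {
            heap.offer(v);
            if (heap.size() > n) {
                heap.poll(); // drop the current smallest
            }
        }
        List<Integer> result = new ArrayList<>(heap);
        result.sort(Collections.reverseOrder()); // largest first
        return result;
    }

    public static void main(String[] args) {
        System.out.println(topN(new int[] {7, 2, 9, 4, 8}, 3)); // [9, 8, 7]
    }
}
```

This costs O(rows * log N) instead of a full O(rows * log rows) sort, which matters when N is small and the stream is large.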

Failing to create a dashboard using a Kettle data source

Hi everyone,

I am trying to create a dashboard using a Kettle data source.
But when I run it, it shows the following error: Error processing component (temperature_bar).
Please help me solve this problem.
Sorry for my bad English.

Many thanks,
Hoang Anh.

Failing to use a Kettle data source in my CDE dashboard

Hi everyone,

I am using an Apache Kafka consumer to read messages from a broker.
After that, I add some steps to process those messages to get the desired result.
I save all the steps to a transformation file and import it into my CDE dashboard as a data source using "Kettle over kettleTransFromFile".
But when I run the dashboard, it shows the following error: Error processing component (temperature_bar).
temperature_bar is a CCC bar chart component in my dashboard. It uses the above Kettle transformation file as its data source.
I know the Kettle data source causes the problem, but I can't figure out how to fix it.
Please help me solve this problem.
Sorry for my bad English.

Many thanks,
Hoang Anh.


Approach to solve - Late Arriving Dimensions

Hi All,

I am using Kettle 5.1.0 to load data into my data warehouse.
In my current project we retrieve dimension and fact information from different data sources/services.
So we have data issues with late-arriving dimensions (we get the fact data before the dimension record has been inserted/is available).
For dimensions I am using the 'Dimension Lookup/update' step and it works well: even if some dimension data is missing from the current insert, it gets loaded in later runs. But for facts I am incrementally inserting the data using a table input, so once I ignore a fact that doesn't have the appropriate dimension data (due to late arrival), it needs to be reinserted again later. So I want to know:
1. What approaches can we follow to fix the issues with late-arriving dimensions?
2. What steps can we follow to improve the performance of Kettle's Table output insertion in general? (I feel the insertion is taking more time than usual.)

Regards,
Lourdhu.A
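One widely used approach to question 1 is the "inferred member" (placeholder) pattern: when a fact arrives before its dimension row, insert a stub dimension row so the fact can load immediately, then let Dimension Lookup/update overwrite the stub when the real record arrives. A minimal in-memory sketch, with invented keys and attributes and no real database:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the "inferred member" pattern for late-arriving dimensions:
// a fact referencing an unseen dimension key gets a placeholder row instead
// of being dropped. Keys and attribute values here are illustrative only.
public class InferredMember {
    // business key -> dimension attribute (stand-in for a dimension table)
    final Map<String, String> dimension = new HashMap<>();

    // Look up the dimension; create a placeholder row if it is missing,
    // so the fact load never has to skip or park the row.
    String lookupOrInfer(String businessKey) {
        return dimension.computeIfAbsent(businessKey, k -> "UNKNOWN");
    }

    // When the real dimension record finally arrives, overwrite the stub.
    void upsert(String businessKey, String attribute) {
        dimension.put(businessKey, attribute);
    }

    public static void main(String[] args) {
        InferredMember dim = new InferredMember();
        // Fact arrives first: placeholder is created, fact can be loaded.
        System.out.println(dim.lookupOrInfer("C-17")); // UNKNOWN
        // Dimension feed catches up later and repairs the stub.
        dim.upsert("C-17", "Acme Corp");
        System.out.println(dim.lookupOrInfer("C-17")); // Acme Corp
    }
}
```

Because the fact keeps its surrogate key reference, no fact rows need deleting or reloading when the dimension catches up; only the dimension row changes.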

Asking about data normalization

How to view error log in Pentaho CDE Dashboard

Hi everyone,

I am failing to use Pentaho CDE to create a dashboard.
I want to view the error log to get more information so I can fix my bug.
But I can't figure out how to view the error log.
Could anyone please help me figure it out?
Sorry for my bad English.

Many thanks in advance,
Hoang Anh.

Blank screen after installing CDE

Hey guys,
I have a problem with CDE showing a blank screen. I installed the CDE stable version on BI Server 4.8. After installing, I can't see the CDE interface at all -- neither the Components panel nor anything else -- and when I try to click the Save and New buttons, nothing happens.

I have reinstalled the trunk version and encountered the same issue. I checked the server log, but there are no error entries.

My server log:
2016-06-13 18:14:40,836 WARN [org.pentaho.hadoop.shim.HadoopConfigurationLocator] Unable to load Hadoop Configuration from "file:///D:/Pentaho/BiServer/biserver-ce-4.8.0-stable/biserver-ce/pentaho-solutions/system/kettle/plugins/pentaho-big-data-plugin/hadoop-configurations/mapr". For more information enable debug logging.
2016-06-13 18:14:41,606 WARN [org.pentaho.reporting.libraries.base.boot.PackageManager] Unresolved dependency for package: org.pentaho.reporting.engine.classic.extensions.datasources.cda.CdaModule
2016-06-13 18:14:41,626 WARN [org.pentaho.reporting.libraries.base.boot.PackageSorter] A dependent module was not found in the list of known modules.
2016-06-13 18:14:45,026 WARN [org.apache.commons.httpclient.HttpMethodBase] Going to buffer response body of large or unknown size. Using getResponseBodyAsStream instead is recommended.
2016-06-13 18:14:45,136 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]




I would appreciate any advice.

How to capture changed data from Oracle to Mysql

Hi Guys,

I am trying to move our ad-hoc reporting from production to MySQL.

I am using Pentaho Kettle 4.3.0 to copy data from our Oracle production database to MySQL.

I have created a transformation job which pulls a list of tables from Oracle to MySQL. However, I am not able to implement timestamp-based change data capture properly.

I have created a job which has the list of tables and a repeat interval of 6 hours, so that every 6 hours the job runs the list of transformations, where each table input uses the query below:

select * from transactions where timestamp >= sysdate - 6/24

I thought it would copy the records changed in the last 6 hours from Oracle to MySQL. But the issue is that the jobs run sequentially, and by the time the job gets from the first table to the last, there have been many insertions in the source tables which do not get properly inserted into the target.

It also creates duplicate records in the target when there are updates in the source.

Then I thought of using a table output to truncate first and insert the full table every time, but this takes a lot of time, and to insert into each table in parallel I would need to create N jobs, which is not practically feasible.

Can you guys suggest something I can implement?

Thanks in advance.

Regards,
Pritiranjan Khilar
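One pattern worth considering here -- sketched below, not a drop-in PDI job -- is a persisted high-water mark: instead of a wall-clock predicate like sysdate - 6/24, read the newest timestamp already loaded into MySQL and pull only rows newer than that, writing with an insert/update on the primary key so updates cannot create duplicates. The in-memory simulation uses invented ids and timestamps:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of high-water-mark CDC: pull everything strictly newer than the
// newest timestamp already in the target and upsert by key, so neither a
// late-running job nor a source update can cause missed rows or duplicates.
public class WatermarkCdc {
    // id -> timestamp (stand-in for the target table)
    final Map<Integer, Long> target = new LinkedHashMap<>();

    // Equivalent of "select max(timestamp) from target".
    long watermark() {
        return target.values().stream().mapToLong(Long::longValue).max().orElse(0L);
    }

    // Copy only source rows newer than the watermark; put() acts as the
    // upsert, so re-running the job is idempotent.
    void sync(Map<Integer, Long> source) {
        long mark = watermark();
        source.forEach((id, ts) -> {
            if (ts > mark) {
                target.put(id, ts);
            }
        });
    }

    public static void main(String[] args) {
        WatermarkCdc cdc = new WatermarkCdc();
        Map<Integer, Long> source = new LinkedHashMap<>();
        source.put(1, 100L);
        source.put(2, 200L);
        cdc.sync(source);     // initial load
        source.put(2, 300L);  // row 2 updated in the source
        source.put(3, 250L);  // new row
        cdc.sync(source);
        System.out.println(cdc.target); // {1=100, 2=300, 3=250}
    }
}
```

In a real job the upsert would be the Insert/Update step (or an ON DUPLICATE KEY UPDATE load), and rows sharing the exact boundary timestamp need care; the source table also needs the timestamp maintained on updates for this to catch changed rows.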

Unable to use IIF function in MDX

Hi ,

I am new to MDX. I have written the query below:
with SET [kpi_study] As
{[study].[study].[BHC June12]}
SET [geographic] as
{[territory.market_hierarchy].[state].[HP]}
SET [brand] As
{[brand.brand_hierarchy].[brand].[Gold Flake (Unspecified)]}
SET [edu12] As
IIF('All'='All',
[education].[education].members,
[education].[education].[All]
)
SELECT
NON EMPTY {[Measures].[tom]} ON COLUMNS
FROM [funnel_analysis]
where {[kpi_study]*[geographic]*[brand]}

Result: Tom = 4.19

If I pass any other value (SSC/HSC) instead of 'All', I always get the same result. Can anyone tell me where I am going wrong? It seems the IIF function is not working properly.

Postgres Data Type Mapping


Issue in Execute Row SQL Script step when used on File based approach

Hi Forum Members

The transformation fails while executing SQL statements (insert statements) from a file using the Execute Row SQL Script step.
However, if the same SQL statement, or multiple SQL statements, are used with the field-based approach in the Execute Row SQL Script step, it works fine.

So the issue occurs only when using the file-based approach. Can someone let me know what goes wrong here, or is it a bug in PDI?

I have attached the required files (including the log file) and detailed the steps for reproducing the issue.

Steps to Reproduce the issue:

1. Create the table structure in a database table (I used a Postgres DB):

create table execute_row_sql_script_table (
id integer,
another_id integer,
yet_another_id integer
)

2. Download the attached SQL file to the file system, configure the Generate Rows step to point at the path of the downloaded SQL file, then execute the attached transformation.

Thanks in advance for your time!!!
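One plausible reason the file-based mode behaves differently is that the file contents must first be split into individual statements, so terminators matter in a way they do not for a single field value. The splitter below is a deliberately naive sketch (not PDI's actual parser) showing where such splitting goes wrong:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of naive statement splitting, the kind of parsing a file-based SQL
// runner has to do. This is NOT PDI's implementation; it illustrates why a
// script that works as a single field value can fail once it must be split.
public class SqlFileSplitter {

    // Split a script on semicolons, dropping empty fragments.
    static List<String> split(String script) {
        List<String> statements = new ArrayList<>();
        for (String part : script.split(";")) {
            String trimmed = part.trim();
            if (!trimmed.isEmpty()) {
                statements.add(trimmed);
            }
        }
        return statements;
    }

    public static void main(String[] args) {
        String script = "insert into t values (1);\ninsert into t values (2);";
        System.out.println(split(script).size()); // 2
    }
}
```

A split like this breaks as soon as a semicolon appears inside a string literal or comment, so checking the SQL file for those (and for a missing final terminator) is a reasonable first diagnostic before filing a bug.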

How to make a line chart with multiple lines from the same database

Missing plugins found while loading a Kettle transformation into Pentaho Report Designer

Hi everyone,


I created a Kettle transformation which has an Apache Kafka Consumer step and runs successfully.
After that, I want to import the transformation into Pentaho Report Designer as a data source for my report.
But when I import it, I get the following error:
org.pentaho.di.core.exception.KettleMissingPluginsException:
Missing plugins found while loading a transformation

Step : KafkaConsumer


I have the pentaho-kafka-consumer plugin, but I don't know how to add that plugin to Pentaho Report Designer.


Could anyone please help me solve this problem?
Sorry for my bad English.


Many thanks,
Hoang Anh.

Pentaho Initialization Exception

I deployed Pentaho in Eclipse.

In my browser I get the error:
Pentaho Initialization Exception

The following errors were detected
[zh_CN_49] One or more system listeners failed. These are set in the systemListeners.xml.
org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - An error occurred while trying to execute the startup sequence org.pentaho.platform.osgi.OSGIBoot

Please see the server console for more details on each error detected.

Eclipse console logs:

Jun 14, 2016 4:01:23 PM org.apache.catalina.core.ApplicationContext log
INFO: No Spring WebApplicationInitializer types detected on classpath
[Server@6539448b]: [Thread[localhost-startStop-1,5,main]]: checkRunning(false) entered
[Server@6539448b]: [Thread[localhost-startStop-1,5,main]]: checkRunning(false) exited
[Server@6539448b]: Initiating startup sequence...
[Server@6539448b]: Server socket opened successfully in 1 ms.
[Server@6539448b]: Database [index=0, id=0, db=file:../../data/hsqldb/quartz, alias=quartz] opened sucessfully in 224 ms.
[Server@6539448b]: Database [index=1, id=1, db=file:../../data/hsqldb/hibernate, alias=hibernate] opened sucessfully in 34 ms.
[Server@6539448b]: Database [index=2, id=2, db=file:../../data/hsqldb/sampledata, alias=sampledata] opened sucessfully in 26 ms.
[Server@6539448b]: Startup sequence completed in 287 ms.
[Server@6539448b]: 2016-06-14 16:01:24.196 HSQLDB server 2.3.2 is online on port 9001
[Server@6539448b]: To close normally, connect and execute SHUTDOWN SQL
[Server@6539448b]: From command line, use [Ctrl]+[C] to abort abruptly
Jun 14, 2016 4:01:24 PM org.apache.catalina.core.ApplicationContext log
INFO: Initializing Spring root WebApplicationContext
The Pentaho BI Platform server did not initialize correctly. The system will not work.
Jun 14, 2016 4:01:25 PM org.apache.tomcat.util.descriptor.web.SecurityConstraint findUncoveredHttpMethods
SEVERE: For security constraints with URL pattern [/jsp/*] only the HTTP methods [POST GET] are covered. All other methods are uncovered.
Jun 14, 2016 4:01:35 PM org.apache.catalina.core.StandardContext loadOnStartup
SEVERE: Servlet [proxy] in web application [/pentaho] threw load() exception
javax.servlet.ServletException: Bundle context attribute [org.osgi.framework.BundleContext] not set in servlet context
at org.apache.felix.http.proxy.ProxyServlet.getBundleContext(ProxyServlet.java:81)
at org.apache.felix.http.proxy.ProxyServlet.doInit(ProxyServlet.java:50)
at org.apache.felix.http.proxy.ProxyServlet.init(ProxyServlet.java:39)
at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:1231)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1144)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1031)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4914)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5201)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1101)
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1786)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deployment of web application directory D:\java\tomcat-8.0.33\webapps\pentaho has finished in 22,837 ms
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory D:\java\tomcat-8.0.33\webapps\pentaho-style
Jun 14, 2016 4:01:36 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deployment of web application directory D:\java\tomcat-8.0.33\webapps\pentaho-style has finished in 166 ms
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory D:\java\tomcat-8.0.33\webapps\ROOT
Jun 14, 2016 4:01:36 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deployment of web application directory D:\java\tomcat-8.0.33\webapps\ROOT has finished in 170 ms
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory D:\java\tomcat-8.0.33\webapps\sw-style
Jun 14, 2016 4:01:36 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deployment of web application directory D:\java\tomcat-8.0.33\webapps\sw-style has finished in 161 ms
Jun 14, 2016 4:01:36 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["http-apr-9999"]
Jun 14, 2016 4:01:36 PM org.apache.coyote.AbstractProtocol start
INFO: Starting ProtocolHandler ["ajp-apr-8009"]
Jun 14, 2016 4:01:36 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 25448 ms
My change to tomcat/web.xml:
<context-param>
<param-name>solution-path</param-name>
<param-value>D:\pentaho-platform-6.0.1.0\assembly\package-res\biserver\pentaho-solutions</param-value> <!--this is my pentaho-solutions path -->
</context-param>

Modified Java Script Value behaving different in Spoon and kitchen

I have a Modified Java Script Value step which looks like this:

//Script here
if (Connectiontime <= 15)
{
    var belowfifteen = 1;
}
else
{
    var belowfifteen = 0;
}

It works well when I run the transformation in Spoon, but when I run it in Kitchen via the command line, the variable "belowfifteen" is always 0.

Can anyone help me please?
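Since var is function-scoped in JavaScript, both branches should normally leave belowfifteen visible; one plausible cause of a Spoon/Kitchen difference is the step's compatibility-mode setting or a different script-engine version between environments. Either way, the robust pattern is to declare the variable once and assign it conditionally. The same logic as a sketch in plain Java (Kettle's own language), with connectiontime standing in for the incoming field:

```java
// Sketch of the declare-once pattern that avoids any scoping ambiguity:
// a single conditional assignment instead of declarations inside branches.
public class BelowFifteen {
    static int belowFifteen(double connectiontime) {
        return (connectiontime <= 15) ? 1 : 0;
    }

    public static void main(String[] args) {
        System.out.println(belowFifteen(10)); // 1
        System.out.println(belowFifteen(20)); // 0
    }
}
```

The analogous JavaScript fix is to declare belowfifteen once before the if/else and only assign inside the branches.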