Channel: Pentaho Community Forums

Pentaho MapReduce error

I am using Pentaho MapReduce on HDP 2.3 and got the following error. What is the problem? Thanks in advance.
2016-10-26 20:58:07,586 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved host19522 to /default-rack
2016-10-26 20:58:07,587 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1476192097407_0079_m_000000_2 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
2016-10-26 20:58:07,587 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_1476192097407_0079_01_000004 to attempt_1476192097407_0079_m_000000_2
2016-10-26 20:58:07,587 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=<memory:14336, vCores:1>
2016-10-26 20:58:07,587 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
2016-10-26 20:58:07,587 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:1 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:3 ContRel:0 HostLocal:1 RackLocal:0
2016-10-26 20:58:07,592 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_1476192097407_0079_01_000004 taskAttempt attempt_1476192097407_0079_m_000000_2
2016-10-26 20:58:07,592 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1476192097407_0079_m_000000_2
2016-10-26 20:58:07,592 INFO [ContainerLauncher #4] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : host19522:45454
2016-10-26 20:58:07,671 INFO [ContainerLauncher #4] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1476192097407_0079_m_000000_2 : 13562
2016-10-26 20:58:07,672 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1476192097407_0079_m_000000_2] using containerId: [container_1476192097407_0079_01_000004 on NM: [host19522:45454]
2016-10-26 20:58:07,672 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1476192097407_0079_m_000000_2 TaskAttempt Transitioned from ASSIGNED to RUNNING
2016-10-26 20:58:08,590 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1476192097407_0079: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit=<memory:14336, vCores:1> knownNMs=2
2016-10-26 20:58:14,156 INFO [Socket Reader #1 for port 58314] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1476192097407_0079 (auth:SIMPLE)
2016-10-26 20:58:14,189 INFO [IPC Server handler 0 on 58314] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1476192097407_0079_m_000004 asked for a task
2016-10-26 20:58:14,189 INFO [IPC Server handler 0 on 58314] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1476192097407_0079_m_000004 given task: attempt_1476192097407_0079_m_000000_2
2016-10-26 20:58:20,594 INFO [IPC Server handler 28 on 58314] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1476192097407_0079_m_000000_2 is : 0.0
2016-10-26 20:58:20,616 FATAL [IPC Server handler 26 on 58314] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1476192097407_0079_m_000000_2 - exited : java.io.IOException: org.pentaho.di.core.exception.KettleException:
Output step not defined in transformation

at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:496)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.pentaho.di.core.exception.KettleException:
Output step not defined in transformation

at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:476)
... 7 more

2016-10-26 20:58:20,616 INFO [IPC Server handler 26 on 58314] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1476192097407_0079_m_000000_2: Error: java.io.IOException: org.pentaho.di.core.exception.KettleException:
Output step not defined in transformation

at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:496)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.pentaho.di.core.exception.KettleException:
Output step not defined in transformation

at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:476)
... 7 more

2016-10-26 20:58:20,618 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1476192097407_0079_m_000000_2: Error: java.io.IOException: org.pentaho.di.core.exception.KettleException:
Output step not defined in transformation

at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:496)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.pentaho.di.core.exception.KettleException:
Output step not defined in transformation

at org.pentaho.hadoop.mapreduce.PentahoMapRunnable.run(PentahoMapRunnable.java:476)
... 7 more

Transformation without fields?

Hi everybody,

I am a Pentaho rookie and want to call transformations from a job. It should be easy.
The first transformation doesn't work on its own; no file is given, as the log shows:

2016/10/26 12:56:31 - Set Property Variables.0 - Startet die Durchfuehrung...
2016/10/26 12:56:31 - Property Input.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleException:
2016/10/26 12:56:31 - Property Input.0 - No file(s) specified! Stop processing.

Called from the job, the first transformation works.
The second .ktr call (Step 2) is prepared and then returns without doing anything. Doing nothing is strange; it seems to me that the call has received no fields.
I am at a loss. Am I right? And how can I change that? Can anybody help?

I attached the concerned files.

How to write unit tests for Kettle?

Problems installing Pentaho 6.1.0.1 as a service in Windows Server 2003 32 bit

Hi everybody,

I'm trying to install Pentaho 6.1.0.1 as a service on Windows Server 2003 32-bit. When I run the "service.bat install" command, the command prompt shows this message: "tomcat8.exe is valid, but is for a machine type other than the current machine". Looking for information, I found that this message can appear because the bundled Tomcat is a 64-bit build instead of 32-bit. Is there any problem if I download the 32-bit Tomcat 8 and replace the original Tomcat folder in the biserver-ce folder with it? I don't know the correct steps to switch from the 64-bit to the 32-bit version of Tomcat without breaking the biserver-ce functionality, or whether this error has some other cause.

Can someone help me, please? Whatever help would be very appreciated!

Thanks in advance!

How to study & configure Schema Workbench

Hi,
Where can I learn (from books or the internet) how to use and configure Schema Workbench?

Thanks

How to hide/show a line chart by clicking the legend of another chart?

Hello

I made a dashboard that displays 5 charts, and all 5 charts always show the same legends (series).

How can I make legendClickMode (ToggleVisible) hide/show the corresponding series on all charts synchronously?

Is there another way to deal with this issue?

Can we use Pentaho in commercial applications?

I am building a web application in JSF. I need to use a reporting tool to generate reports and export them to PDF or Excel.

CDA/CDE misses out MDX measures

Hello!

I created an OLAP cube with Schema Workbench and published it to Pentaho BI Server 6.1.

My problem:
The exact same MDX query that works perfectly from Schema Workbench or from the Saiku plugin is missing some measures once I run it through CDA. There is no error, but the returned result does not contain all the requested columns. The same behavior appears when you query unknown measures from CDA, so I suppose CDA does not find these measures (but Saiku/SW does).
The missing measures are calculated measures; however, some other calculated measures do work.

Does anybody have an idea where to start looking for errors?
Thanks for your help & kind regards
Oliver

Conditional Subreports - How to stop execution of subreport query

I have a main report and a subreport.
Based on fields from the main query, the subreport should only run if a field has a certain value.
It appears that I can set the visible, sub-report-active and invisible-consumes-space properties to false, but the data set in the subreport still gets executed (it's a PDI source and causes an error because the data set is meaningless for the passed parameter).

I don't want to just hide the report; I want it not to execute its query at all.

Is there a way to stop a subreport's query from running at all, or perhaps to force a no-data response?

Or is it the case that the data set in a subreport is always run regardless?

Thanks

Slow loading of CDE

Hi, I have a problem with a Pentaho dashboard.
The problem is that it loads very slowly, and when I look at what is slowing the load down, I see doQuery processes, one for each of the queries I run.
How can I inspect these queries or speed up the loading?
Thank you very much.

Remote execution of jobs

Hello, I'm a new user of Kettle and I have a problem:
I'm doing data migration work and running all the transformations and jobs locally,
but the client requested a DI server, and we have a question: how can I run these transformations and jobs on the server? That is, develop locally and run on the server?
I tried to read about slave servers but didn't understand it very well... Please shed some light on this!
http://wiki.pentaho.com/display/EAI/...cution+of+jobs

EDIT: My local computer is Windows and the server is Ubuntu Server

Not able to download jars

Hi,

I have an Ant project that uses Pentaho libraries. It was working fine until a few days ago, but since yesterday I have not been able to download the jars.
I have attached the stack trace from building the project.

Any help will be appreciated.

Date conversion and lookup

I have an Excel file. One of its columns is of the Date datatype. I have a date dimension in which the calender_date field is a Timestamp. I want to take the Date field from the Excel document, compare it with the calender_date field in date_dim, and return the SK. I am doing this with Dimension Lookup/Update. What would be the ideal way of doing this lookup?

I tried converting the date field from the Excel document to a Timestamp and doing the lookup, but it returns the zero key (the first row of date_dim, which is null). Any ideas?
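The zero key usually means no exact match was found. One likely cause is that the Timestamp values in the dimension carry a time-of-day component, so an equality lookup against a date-only value never matches. A minimal Java sketch, outside PDI and with made-up values, of normalizing the Timestamp to its calendar date before comparing:

```java
import java.sql.Timestamp;
import java.time.LocalDate;

public class DateKeyLookup {
    // Truncate a dimension Timestamp to its calendar date so it can be
    // compared with a date-only value coming from Excel.
    static LocalDate toCalendarDate(Timestamp ts) {
        return ts.toLocalDateTime().toLocalDate();
    }

    public static void main(String[] args) {
        Timestamp dimValue = Timestamp.valueOf("2016-10-26 13:45:12");
        LocalDate excelValue = LocalDate.of(2016, 10, 26);

        // Comparing the raw Timestamp to a date never matches because of the
        // time part; comparing the truncated value matches as expected.
        System.out.println(toCalendarDate(dimValue).equals(excelValue)); // true
    }
}
```

In PDI terms, the equivalent would be converting both fields to a common Date type with the time part zeroed before the lookup step.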

Merge Join vs Stream Lookup to validate input data against data table?

Hi there.

I'm importing names from an Excel flat file A.

I want to check if these names are in SQL Server table B. If they are, write them to the database. If they aren't, I probably want to send them to another (error) output step.

What's the best way to do this?

I can do a Merge Join (a left join, I suppose, to keep non-matches) or a Stream Lookup, and then use a Filter Rows step to send the good rows to a database write and the bad rows to a log-type file.


Is one preferable? Is there a more elegant way to achieve this, maybe using on-error? Just wondering. Thanks.
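Whichever step is chosen, the core of the check is set membership against the reference table; a Stream Lookup effectively builds an in-memory lookup set from table B. A hypothetical Java sketch of the good/bad split described above:

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class NameValidator {
    // Partition incoming names into "found in reference table" (true) and
    // "not found" (false), mirroring a Stream Lookup plus Filter Rows.
    static Map<Boolean, List<String>> validate(List<String> input, Set<String> reference) {
        return input.stream().collect(Collectors.partitioningBy(reference::contains));
    }

    public static void main(String[] args) {
        Set<String> tableB = Set.of("alice", "bob");
        Map<Boolean, List<String>> result = validate(List.of("alice", "mallory"), tableB);
        System.out.println(result.get(true));  // rows to write to the database
        System.out.println(result.get(false)); // rows to send to the error output
    }
}
```

The practical difference in PDI is mostly about data volume: a lookup keeps table B in memory, while a Merge Join streams both sorted inputs.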

Sum value until next non-null data

Hello guys.



But I have one full data row (with number, value, delivery_date, city) and other rows where the value is not null but the other fields are null...

Then I need to sum the values from the first full row until the next non-null row.

Sum_until_next_non_null.jpg

Example: I need to sum the values in the square and repeat this with the underlined number and the other data.

I'm sorry if there's another post about this, and I'm sorry for the bad explanation.

Thanks.
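The logic described can be sketched outside PDI as: remember the last complete row, and keep accumulating values until the next complete row starts a new group. Field names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

public class SumUntilNextNonNull {
    record Row(String number, double value) {}   // number is null on value-only rows
    record Group(String number, double total) {}

    // Sum each complete row's value together with the values of the
    // following incomplete rows, until the next complete row appears.
    static List<Group> summarize(List<Row> rows) {
        List<Group> out = new ArrayList<>();
        String current = null;
        double total = 0;
        for (Row r : rows) {
            if (r.number() != null) {            // new complete row: flush previous group
                if (current != null) out.add(new Group(current, total));
                current = r.number();
                total = 0;
            }
            total += r.value();
        }
        if (current != null) out.add(new Group(current, total));
        return out;
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row("1001", 10), new Row(null, 5), new Row(null, 2.5),
                new Row("1002", 1));
        // One group per complete row: 1001 -> 17.5, 1002 -> 1.0
        System.out.println(summarize(rows));
    }
}
```

In PDI, a similar effect could come from filling the null fields down from the last complete row and then grouping on the carried key.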

Excel Input File issue

I need to read some Excel files with several tabs that can be "very" large. With some files the Excel Input step works fine (Apache POI), but for other, larger files I had to change the spreadsheet type engine to Excel 2007 XLSX (Apache POI Streaming). It finds the sheets but does not find any fields. At times I get a GC overhead error. Any suggestions or workarounds for this issue?
I am working on version 5.4. Do you think it will perform better on version 6?
The documentation says:
"- Excel 2007 XLSX (Apache POI): If you select this spread sheet type you can read all known Excel file types. Functionality provided by the Apache POI project.
- Excel 2007 XLSX (Apache POI Streaming): This spread sheet type allows to read in large Excel files (Experimental in 5.1)."


Thanks in advance. Any suggestions would be greatly appreciated.

Put Multiple Records in an email

Is it possible to put multiple records in a single email?

I need a method in Kettle similar to bursting reports (i.e. for each person, send an email with a table of data).
I thought of some strategies, such as using a Row Denormaliser to create a long string, but this is not ideal because I don't know the maximum number of records that could be included.
Another strategy would be to write a file for each email and then use something like PowerShell to send the email.
I understand I could zip the file and use PDI to send the email, but I don't want the contents attached as a zip.
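The denormaliser limitation above (a fixed maximum column count) goes away if the rows are grouped per recipient and joined into one body of arbitrary length. A hypothetical Java sketch of that grouping, with made-up field names:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EmailBurster {
    record Entry(String email, String line) {}

    // Build one message body per recipient, with as many lines as that
    // recipient has records -- no fixed maximum needed.
    static Map<String, String> bodiesByRecipient(List<Entry> records) {
        return records.stream().collect(Collectors.groupingBy(
                Entry::email,
                LinkedHashMap::new,
                Collectors.mapping(Entry::line, Collectors.joining("\n"))));
    }

    public static void main(String[] args) {
        List<Entry> rows = List.of(
                new Entry("a@x.com", "order 1"),
                new Entry("a@x.com", "order 2"),
                new Entry("b@x.com", "order 3"));
        bodiesByRecipient(rows).forEach((to, body) ->
                System.out.println("To: " + to + "\n" + body + "\n"));
    }
}
```

In PDI, the analogous move is concatenating the per-person rows into a single field and passing that field as the email body, one email per group.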

Excel Input prepends a blank space to "string" decimals

Maybe this is a bug report; I saw a couple of other posts on it.

Essentially I'm reading validated data from an Excel sheet, in decimal form. Even the most validated Excel sheet STILL allows garbage like "stringstringstringAMERICA" in a numeric-only cell, because it can get there via copy/paste.

So rather than 'ignore all errors', I like to read these (unprotected) fields from the Excel sheet as strings, then convert them to numbers when outputting or writing to a database.


However, a decimal like 0.66 is converted to " 0.66"; note the extra space at the beginning. This causes an error when you convert the string back to a number, unless you remove the extra space with an input mask. Not a big deal, but it sounds like a bug.
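The workaround described (strip the space before conversion) can be sketched in plain Java; in PDI the same effect could come from a trim option or the input mask mentioned above:

```java
public class DecimalParse {
    // Excel Input can yield " 0.66" when a numeric cell is read as a string;
    // strip surrounding whitespace before handing it to number conversion.
    static double parseDecimal(String raw) {
        return Double.parseDouble(raw.trim());
    }

    public static void main(String[] args) {
        System.out.println(parseDecimal(" 0.66")); // 0.66
    }
}
```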

Distributing Kettle transformation

Hello!


I made a Kettle transformation that converts data from text files to RDF output (N-Triples files). However, depending on the number of input files, execution takes a long time. Is there a way to distribute the execution in Kettle?


A cluster with 4 nodes is available to me. I would like to do something like this:
If I have 20 input files, I run the transformation on 4 nodes, but each node receives only 5 input files.


I have tried clustering in Kettle as described on http://type-exit.org/adventures-with...ing-in-kettle/
But it didn't work...


I'm using PDI 4.1 because I'm using a plugin that works only with this version.
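The 20-files-on-4-nodes split described above is a plain round-robin partition of the input list. A hypothetical Java sketch of that distribution, independent of Kettle's clustering machinery:

```java
import java.util.ArrayList;
import java.util.List;

public class FilePartitioner {
    // Distribute input files across N nodes round-robin, so 20 files on
    // 4 nodes gives each node exactly 5 files.
    static List<List<String>> partition(List<String> files, int nodes) {
        List<List<String>> buckets = new ArrayList<>();
        for (int i = 0; i < nodes; i++) buckets.add(new ArrayList<>());
        for (int i = 0; i < files.size(); i++) {
            buckets.get(i % nodes).add(files.get(i));
        }
        return buckets;
    }

    public static void main(String[] args) {
        List<String> files = new ArrayList<>();
        for (int i = 1; i <= 20; i++) files.add("input-" + i + ".txt");
        List<List<String>> perNode = partition(files, 4);
        perNode.forEach(node -> System.out.println(node.size() + " files: " + node));
    }
}
```

Without working cluster support, one pragmatic option is running this split outside PDI and launching the same transformation on each node with its own file list as a parameter.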