Channel: Pentaho Community Forums

BI Server 5.3 and Active Directory

Hi, I'm using BI Server 5.3 and trying to set up authentication through Active Directory, but it does not work.
I followed the instructions at https://help.pentaho.com/Documentation/5.3/0P0/0W0/004
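
For anyone cross-checking their setup against that guide, here is a minimal sketch of the file it has you edit (assuming the stock 5.x property names; the host, DNs, and password below are placeholders):

Code:

# pentaho-solutions/system/applicationContext-security-ldap.properties
contextSource.providerUrl=ldap://ad.example.com:389
contextSource.userDn=cn=bind_user,cn=Users,dc=example,dc=com
contextSource.password=bind_password

userSearch.searchBase=cn=Users,dc=example,dc=com
userSearch.searchFilter=(sAMAccountName={0})

populator.groupRoleAttribute=cn
populator.groupSearchBase=cn=Users,dc=example,dc=com
populator.groupSearchFilter=(member={0})

# and in pentaho-solutions/system/security.properties:
# provider=ldap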

Catalina log:

tail -f tomcat/logs/catalina.out
Apr 02, 2015 3:01:20 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/15 config=null
Apr 02, 2015 3:01:20 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 4965 ms
Apr 02, 2015 3:03:16 PM org.apache.coyote.http11.Http11Protocol pause
INFO: Pausing Coyote HTTP/1.1 on http-8080
Apr 02, 2015 3:03:17 PM org.apache.catalina.core.StandardService stop
INFO: Stopping service Catalina
Apr 02, 2015 3:03:17 PM org.apache.coyote.http11.Http11Protocol destroy
INFO: Stopping Coyote HTTP/1.1 on http-8080
Apr 02, 2015 3:03:32 PM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
Apr 02, 2015 3:03:32 PM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-8080
Apr 02, 2015 3:03:32 PM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 545 ms
Apr 02, 2015 3:03:32 PM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
Apr 02, 2015 3:03:32 PM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.41
Apr 02, 2015 3:03:32 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor pentaho.xml
log4j:WARN Continuable parsing error 187 and column 23
log4j:WARN The content of element type "log4j:configuration" must match "(renderer*,throwableRenderer?,appender*,plugin*,(category|logger)*,root?,(categoryFactory|loggerFactory)?)".
[Server@46ae10a6]: [Thread[main,5,main]]: checkRunning(false) entered
[Server@46ae10a6]: [Thread[main,5,main]]: checkRunning(false) exited
[Server@46ae10a6]: Initiating startup sequence...
[Server@46ae10a6]: Server socket opened successfully in 1 ms.
[Server@46ae10a6]: Database [index=0, id=0, db=file:../../data/hsqldb/sampledata, alias=sampledata] opened sucessfully in 1381 ms.
[Server@46ae10a6]: Database [index=1, id=1, db=file:../../data/hsqldb/hibernate, alias=hibernate] opened sucessfully in 19 ms.
[Server@46ae10a6]: Database [index=2, id=2, db=file:../../data/hsqldb/quartz, alias=quartz] opened sucessfully in 26 ms.
[Server@46ae10a6]: Startup sequence completed in 1429 ms.
[Server@46ae10a6]: 2015-04-02 15:03:35.971 HSQLDB server 1.8.0 is online
[Server@46ae10a6]: To close normally, connect and execute SHUTDOWN SQL
[Server@46ae10a6]: From command line, use [Ctrl]+[C] to abort abruptly
Pentaho BI Platform server is ready. (Pentaho Open Source BA Server 5.3.0.0-213) Fully Qualified Server Url = http://localhost:8080/pentaho/, Solution Path = /opt/biserver.5.3.postgresql/pentaho-solutions
15:04:02,581 ERROR [CteDefaultProviderManager] Provider ID is blacklisted: sparkl. Discarding it..
Apr 02, 2015 3:04:03 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory sw-style
Apr 02, 2015 3:04:03 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
Apr 02, 2015 3:04:03 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory pentaho-style
Apr 02, 2015 3:04:03 PM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-8080
Apr 02, 2015 3:04:03 PM org.apache.jk.common.ChannelSocket init
INFO: JK: ajp13 listening on /0.0.0.0:8009
Apr 02, 2015 3:04:03 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/29 config=null
Apr 02, 2015 3:04:03 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 30948 ms
Attempting to load ESAPI.properties via file I/O.
Attempting to load ESAPI.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/biserver.5.3.postgresql/tomcat/bin/ESAPI.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/ESAPI.properties
Not found in 'user.home' (/home/pentaho) directory: /home/pentaho/esapi/ESAPI.properties
Loading ESAPI.properties via file I/O failed. Exception was: java.io.FileNotFoundException
Attempting to load ESAPI.properties via the classpath.
SUCCESSFULLY LOADED ESAPI.properties via the CLASSPATH from '/ (root)' using current thread context class loader!
SecurityConfiguration for Validator.ConfigurationFile not found in ESAPI.properties. Using default: validation.properties
Attempting to load validation.properties via file I/O.
Attempting to load validation.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/biserver.5.3.postgresql/tomcat/bin/validation.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/validation.properties
Not found in 'user.home' (/home/pentaho) directory: /home/pentaho/esapi/validation.properties
Loading validation.properties via file I/O failed.
Attempting to load validation.properties via the classpath.
validation.properties could not be loaded by any means. fail. Exception was: java.lang.IllegalArgumentException: Failed to load ESAPI.properties as a classloader resource.

Please help me.

Spoon core dumped

Hi all,

I am new to Pentaho. Currently I have it configured and working, but while trying to load data into a database, Pentaho froze and closed. I am getting the following error in my terminal; could anyone suggest a fix?


java: cairo-misc.c:380: _cairo_operator_bounded_by_source: Assertion `NOT_REACHED' failed.
./spoon.sh: line 205: 19924 Aborted (core dumped) "$_PENTAHO_JAVA" $OPT -jar "$STARTUP" -lib $LIBPATH "${1+$@}"
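
The cairo assertion points at a GTK drawing conflict rather than a PDI bug. One workaround often suggested for SWT applications on Ubuntu-style desktops is to disable the overlay scrollbar and menu proxy before launching (an assumption; it may not apply to your distribution):

Code:

# workaround sometimes suggested for GTK/cairo crashes in SWT apps on Linux
export LIBOVERLAY_SCROLLBAR=0
export UBUNTU_MENUPROXY=0
./spoon.sh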




Thanks & Regards
Amithsha

Using Maps in Pentaho Report Designer

Hello Friends,

I wanted to create a prpt report containing maps to display location-wise data. I read in a few threads that it is possible in PRD but could not find any tutorials or samples to include geomaps in a report. Can anyone help me by providing links to any tutorials for it or by listing the steps to achieve it?

Thanks in advance,
Shashank

Hide and show charts dynamically

Please, could anyone advise me a proper way to hide/show charts (in particular, Table Component) dynamically from code?
For example, from the clickAction of another component I do the following (trying to hide a table component named serviceCallCount, sitting in an HTML object named 'r2'):
Code:

...
var tmp = Dashboards.getComponentByName('render_serviceCallCount');
alert(tmp.htmlObject); // r2
tmp.htmlObject = null;
alert(tmp.htmlObject); // null
Dashboards.fireChange('isSelected'); // serviceCallCount is listening for this param
alert(tmp.htmlObject); // null
tmp.update(); // one more!
alert(tmp.htmlObject); // null
...

but serviceCallCount remains visible on my dashboard.
Setting tmp.visible to false instead of nulling tmp.htmlObject hides the table only briefly, while the chart is inside its Post Fetch function; before and after that, it remains visible in spite of visible = false.
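
A possible approach (a sketch; CDF bundles jQuery, and the component/object names are the ones from this post): rather than nulling htmlObject, toggle the placeholder div it points to, and ask CDF to re-render when showing the component again.

Code:

// hide/show the component's placeholder div instead of nulling htmlObject
var comp = Dashboards.getComponentByName('render_serviceCallCount');

function hideTable() {
    $('#' + comp.htmlObject).hide();   // hides whatever was rendered into 'r2'
}

function showTable() {
    $('#' + comp.htmlObject).show();
    Dashboards.update(comp);           // re-render with the current parameters
}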

S3Output plugin missing in pdi-ce-5.3.0.0-213

Hi,

I am running a simple S3 file upload step using pdi-ce-5.3.0.0-213.

It works fine on Windows 7 64-bit, but throws a Missing Plugin Exception on an AWS Amazon Linux AMI EC2 server.

Please help!

Attached are the Pentaho files, which I ran on EC2 using the command /home/ec2-user/pdi-ce-5.3.0.0-213/data-integration/kitchen.sh -file:"/home/ec2-user/poc/tests3.kjb" -level:Basic

Here are the logs of the S3 file upload job run:

2015/04/02 17:36:47 - Kitchen - Logging is at level : Basic logging
2015/04/02 17:36:47 - Kitchen - Start of run.
2015/04/02 17:36:49 - tests3 - Start of job execution
2015/04/02 17:36:49 - tests3 - Starting entry [Transformation]
2015/04/02 17:36:49 - Transformation - Loading transformation from XML file [file:///home/ec2-user/poc/tests3.ktr]
2015/04/02 17:36:49 - tests3 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/04/02 17:36:49 - tests3 - Unexpected error during transformation metadata load
2015/04/02 17:36:49 - tests3 -
2015/04/02 17:36:49 - tests3 - Missing plugins found while loading a transformation
2015/04/02 17:36:49 - tests3 -
2015/04/02 17:36:49 - tests3 - Step : S3FileOutputPlugin
2015/04/02 17:36:49 - tests3 -
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1205)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:648)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.Job.execute(Job.java:716)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.Job.execute(Job.java:859)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.Job.execute(Job.java:532)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.Job.run(Job.java:424)
2015/04/02 17:36:49 - tests3 - Caused by: org.pentaho.di.core.exception.KettleMissingPluginsException:
2015/04/02 17:36:49 - tests3 - Missing plugins found while loading a transformation
2015/04/02 17:36:49 - tests3 -
2015/04/02 17:36:49 - tests3 - Step : S3FileOutputPlugin
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2835)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2671)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2623)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2600)
2015/04/02 17:36:49 - tests3 - at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1154)
2015/04/02 17:36:49 - tests3 - ... 5 more
2015/04/02 17:36:49 - tests3 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : A serious error occurred during job execution:
2015/04/02 17:36:49 - tests3 - Unexpected error occurred while launching entry [Transformation.0]
2015/04/02 17:36:49 - tests3 -
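
A diagnostic sketch (paths taken from the post): the step id S3FileOutputPlugin has to be registered from a plugin folder that Kitchen scans at startup, so it is worth comparing what the working Windows install has against the EC2 box.

Code:

# list what the EC2 install actually scans at startup
ls /home/ec2-user/pdi-ce-5.3.0.0-213/data-integration/plugins
# if the folder that provides the S3 step on Windows is absent here,
# re-extract the full pdi-ce zip on the server rather than copying files
# selectively; KETTLE_PLUGIN_BASE_FOLDERS can add extra plugin directories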

Compare Two Strings from different columns

Hi,

I have 4 columns.
I want to compare the col1 value against col2, col3, and col4; when it is found, write the matching column's number to col5 (a match in col2 returns 2, col3 returns 3, col4 returns 4).
col1  col2    col3      col4
aa    ss,dd   aa,bab    ee,ss
vv    vv,ee   rr,ii,pp  ee,ww
ds    rr,ee,  ww,xs     ds,ee,



My output should be :

col1  col2    col3      col4    col5
aa    ss,dd   aa,bab    ee,ss   2
vv    vv,ee   rr,ii,pp  ee,ww   3
ds    rr,ee,  ww,xs     ds,ee   4


Can anyone please help me with a sample transformation or some code?
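
A minimal sketch for a Modified Java Script Value step implementing the rule as stated (note that under this rule the second sample row would return 2, not 3, since vv appears in col2; the checks run col2 first, and the added delimiters prevent partial matches such as aa inside bab):

Code:

// Modified Java Script Value step - col1..col4 are the incoming fields
var col5 = 0;
if (("," + col2 + ",").indexOf("," + col1 + ",") >= 0) {
    col5 = 2;
} else if (("," + col3 + ",").indexOf("," + col1 + ",") >= 0) {
    col5 = 3;
} else if (("," + col4 + ",").indexOf("," + col1 + ",") >= 0) {
    col5 = 4;
}
// add col5 to the step's output fields as an Integer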

Bringing in accented and other Latin characters

Hello, guys!

I've been surfing the web for the last two weeks trying to make this happen: I need to bring special Latin characters like ñ, á, and others from an SQL script into Pentaho through a Table Input step (Teradata is the SQL engine I'm calling from Pentaho).

My problem is that I have not been able to bring in the data without every "special" character arriving as the � symbol. I have already changed -Dfile.encoding to UTF-8/ISO-8859-1 in both the .bat and the .sh launch scripts, and neither has worked.

I really don't know what else to do and I am close to giving up, so I decided to see if you could help me find a solution.
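
One thing worth trying (an assumption that the session charset, not the JVM encoding, is the culprit): the Teradata JDBC driver has its own CHARSET connection parameter, which can be appended to the connection URL; the host and database names below are placeholders.

Code:

jdbc:teradata://dbhost/DATABASE=mydb,CHARSET=UTF8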

Thanks for all.

LDAP Output Update Field - error with Dn fieldname

I'm trying to update an LDAP field, but I get the error message below in the transformation logs:

Code:

2015/04/02 19:24:31 - LDAP People Output.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/04/02 19:24:31 - LDAP People Output.0 - Error updating entry with dn = [aa]!
2015/04/02 19:24:31 - LDAP People Output.0 - aa: [LDAP: error code 34 - Invalid DN Syntax]
2015/04/02 19:24:31 - LDAP People Output.0 -
2015/04/02 19:24:31 - LDAP People Output.0 -    at org.pentaho.di.trans.steps.ldapinput.LDAPConnection.update(LDAPConnection.java:331)
.....
2015/04/02 19:24:31 - LDAP People Output.0 - Caused by: javax.naming.InvalidNameException: aa: [LDAP: error code 34 - Invalid DN Syntax]; remaining name 'aa'
2015/04/02 19:24:31 - LDAP People Output.0 -    at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:3025)
2015/04/02 19:24:31 - LDAP People Output.0 -    at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:2840)
2015/04/02 19:24:31 - LDAP People Output.0 -    at com.sun.jndi.ldap.LdapCtx.c_modifyAttributes(LdapCtx.java:1478)
2015/04/02 19:24:31 - LDAP People Output.0 -    at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_modifyAttributes(ComponentDirContext.java:273)
2015/04/02 19:24:31 - LDAP People Output.0 -    at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.modifyAttributes(PartialCompositeDirContext.java:190)
2015/04/02 19:24:31 - LDAP People Output.0 -    at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.modifyAttributes(PartialCompositeDirContext.java:179)
2015/04/02 19:24:31 - LDAP People Output.0 -    at javax.naming.directory.InitialDirContext.modifyAttributes(InitialDirContext.java:167)
2015/04/02 19:24:31 - LDAP People Output.0 -    at org.pentaho.di.trans.steps.ldapinput.LDAPConnection.update(LDAPConnection.java:321)
2015/04/02 19:24:31 - LDAP People Output.0 -    ... 3 more

The list of previous fields is retrieved successfully, and I need to update the field nsuniqueid with the value aa.

The search DN for the object is:

Code:

ou=people, globalid=00000000000000000000, ou=acme, o=tivol
And the DN of the object that contains the nsuniqueid field is:

Code:

id=1234567, ou=0, ou=people, globalid=00000000000000000000, ou=acme, o=tivol
Kettle receives the Dn fieldname as shown in the log: dn = [aa]!

LdapOutput-UpdateField.png

LdapOutput-UpdateField2.png

LdapOutput-UpdateField3.png

I tried to find documentation, screenshots, and examples for this component, but unfortunately I didn't find any
information that solves my problem.

I guess I only need to supply the correct syntax for the field to get the desired result.
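
For what it's worth, the error suggests the step is treating the value aa itself as the entry's distinguished name (a reading of the log, not a verified fix): the Dn fieldname should point at a stream field holding the full DN of the entry to modify, while aa belongs in the value being written.

Code:

# sketch of how the fields would map (DN taken from this post):
#   Dn fieldname -> a stream field containing:
#                   id=1234567, ou=0, ou=people, globalid=00000000000000000000, ou=acme, o=tivol
#   attribute    -> nsuniqueid
#   value        -> aa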

Thanks for your help.

jtds and mssql linked servers

Hope I'm missing something...

Using a jTDS connection to a SQL Server 2008 server, "Server B". "Server A" is a linked server to "Server B".

I can use a Table Input step to select from a table in a database on Server A using a connection to a database on Server B. The query (against Server B) looks something like "select * from [Server A].database.dbo.tablename". This works.

I would like to use a Table Output step to insert into the same table, connecting the same way; that is, to insert into the table on Server A using a connection to Server B. This does not seem to work. I had hoped changing the table name in the Table Output step to "[Server A].database.dbo.tablename" would do it. Any thoughts? Thanks!
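
A guess at the cause (an assumption, not verified): Table Output quotes whatever is in the table-name box as a single identifier, so the four-part linked-server name is sent as one quoted token. If so, a step that passes SQL through verbatim should work, e.g. an Execute SQL script step with "Execute for each row?" checked and the stream fields listed as parameters; the column names below are placeholders.

Code:

INSERT INTO [Server A].database.dbo.tablename (col1, col2)
VALUES (?, ?)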

Deleted records from Source Database

Hi,

I am using PDI to transfer daily data from Oracle to MySQL for at least 400 tables.

For that purpose, I am using the Table Input and Insert/Update steps.

Now, on the Oracle side, records in some tables are deleted frequently.

I am using the Merge Rows (diff) and Synchronize After Merge steps to handle deletions, but it takes time and is hard to run daily.

Is there any other method that can improve performance or make this simpler?
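
One pattern that can be cheaper than the row-by-row Synchronize After Merge (a sketch; the table and key names are placeholders): load just the current Oracle primary keys into a staging table on MySQL each run, then delete the vanished rows with one set-based statement in an Execute SQL script step.

Code:

-- MySQL multi-table DELETE: remove rows whose key no longer exists at the source
DELETE t
FROM target_table AS t
LEFT JOIN staging_keys AS s ON s.id = t.id
WHERE s.id IS NULL;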

How to run a job using Carte from a shell script

Hi,

I am using shell scripts to run transformations and jobs via pan.sh and kitchen.sh.

Now I want to use carte.sh in a shell script to run both transformations and jobs.

For example, for a transformation I use pan.sh as below:

v_jobfile=/pentaho/spoon/data-integration/production_reports/Demo.ktr

/pentaho/spoon/data-integration/pan.sh \
    -file="$v_jobfile" -param:v1="$1" -level=Minimal > $OUTDIR
if [ $? -eq 0 ]; then
    echo "The Program Completed Successfully :- Demo.ktr" >> $OUTDIR
    echo $'\n' >> $OUTDIR
fi


So how do I write the shell script to run them through Carte?
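
A sketch of one way to do it (hedged: this assumes a 5.x build whose Carte includes the executeJob servlet, and the default cluster/cluster credentials): start a Carte server, then submit the job file to it over HTTP.

Code:

# start a Carte server on this machine (port 8081 is an arbitrary choice)
/pentaho/spoon/data-integration/carte.sh 127.0.0.1 8081 &

# once it is up, ask it to execute the job file
curl -u cluster:cluster \
  "http://127.0.0.1:8081/kettle/executeJob/?job=/pentaho/spoon/data-integration/production_reports/Demo.kjb&level=Minimal"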

How to check total records in Source and Destination Database

Hi,

I am using PDI to transfer data from Oracle to MySQL daily for at least 400 tables.

I am using the Table Input and Insert/Update steps for it.

Now I want to check the total record counts in both databases; doing it manually means running COUNT(*) on both sides for every table, which is very tedious.

Is there any method in PDI to do this dynamically?
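
A sketch of the usual pattern (the variable and field names here are hypothetical): a parent job reads the 400 table names, loops over them ("Copy rows to result" feeding an entry set to "Execute for every input row"), and each iteration runs a Table Input whose SQL uses variable substitution; run it once against Oracle and once against MySQL and compare the outputs.

Code:

-- Table Input SQL with "Replace variables in script?" checked
SELECT '${TABLE_NAME}' AS table_name, COUNT(*) AS row_cnt
FROM ${TABLE_NAME}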

Pentaho Return Status Codes

Hi,
From the Pentaho Info Center: when you run Kitchen, it returns one of several possible return codes (0, 1, 2, 7, 8, 9) that indicate the result of the operation. Is there any way to forcefully return these (or self-generated) codes from a Pentaho job/transformation step?

My purpose is to handle certain exceptions in Pentaho and then return a status code of my choosing.
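
As far as I know there is no step that sets an arbitrary process exit code; an Abort job entry (or any failing entry) makes Kitchen exit nonzero, but the specific number is Kitchen's, not yours. A wrapper script can remap the documented codes into your own (a sketch; the mapping below is arbitrary):

Code:

#!/bin/sh
/pentaho/spoon/data-integration/kitchen.sh -file="$1" -level=Minimal
rc=$?
case "$rc" in
  0)     exit 0   ;;  # finished without errors
  1|2)   exit 50  ;;  # errors during processing / unexpected error
  7|8|9) exit 60  ;;  # environment, step-load, or command-line problems
  *)     exit "$rc" ;;
esac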

How to get the log file name in the transformation

Hi All,

I used the /logfile option to specify the log file.

How can I get the full log file name inside the transformation? The Kettle version is 5.0.1 CE.

Notes:
I looked at the source code in Kitchen.java:

if ( !Const.isEmpty( optionLogfile ) ) {
  // Kitchen attaches a file appender for the -logfile option
  fileAppender = new FileLoggingEventListener( optionLogfile.toString(), true );
  KettleLogStore.getAppender().addLoggingEventListener( fileAppender );
} else {
  fileAppender = null;
}

But I don't know how to get the file name inside the transformation.
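
One workaround (a sketch; the parameter name is hypothetical): decide the log file name in the calling script and pass it in as a named parameter as well, then read it inside the transformation with a Get Variables step.

Code:

LOGFILE=/var/log/kettle/demo_$(date +%Y%m%d_%H%M%S).log
kitchen.sh -file=/path/to/job.kjb \
           -logfile="$LOGFILE" \
           -param:LOG_FILE="$LOGFILE"
# inside the transformation, a Get Variables step reads ${LOG_FILE}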

BA server 1st time start failure (MySQL repository)

On Red Hat Linux, starting with the start script start-pentaho.sh.

The Tomcat container comes up, but the Pentaho app fails, though not completely: the Jackrabbit tables get created and I can see a trace of connections and queries being run.

The MySQL general log (attached: mysqldg.log) shows a connection made and queries run against the FS tables, the last being
select BUNDLE_DATA from PM_VER_BUNDLE where NODE_ID,...

But pentaho.log (attached) indicates an issue with a connection being refused during init.

I have verified with a JDBC test program that I can connect as jcr_user using the same host, password, and port information.

Is there any additional tracing I can turn on?

MySQL and the Pentaho BA server are on the same Linux box, an Amazon cloud server.
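
For reference, the places where a 5.x BA server reads MySQL connection info (a sketch; worth checking when only part of startup reaches the database):

Code:

tomcat/webapps/pentaho/META-INF/context.xml          # hibernate + quartz JNDI datasources
pentaho-solutions/system/jackrabbit/repository.xml   # jackrabbit repository / jcr_user
# a "connection refused" during init often means one of these still points
# at a default host/port (an assumption)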

Problem creating a data source to MySQL in Pentaho User Console

Hi,

I am working with the Pentaho BA platform.

I want to create a data source from the Pentaho User Console. It's a data source to a MySQL database.

I have to select 'Generic Database' (there is no specific entry for MySQL) and Native (JDBC).

My connection info:


URL: jdbc:mysql://localhost:3306/database_name
Driver class: com.mysql.jdbc.Driver

When I test the connection I get this error:

ConnectionServiceImpl.ERROR_0009 - Connection to database [null] failed



I don't know if the problem is the MySQL driver, so I have copied the mysql-connector-java-5.1.34-b jar to:


C:\Pentaho\server\biserver-ee\tomcat\lib\

But it still doesn't work...
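
One thing to rule out (an assumption based on the [null] in the error): Tomcat only picks up jars in tomcat\lib at startup, so a driver copied while the server was running stays invisible to the connection test until a restart.

Code:

REM restart the BA server so Tomcat reloads tomcat\lib
C:\Pentaho\server\biserver-ee\stop-pentaho.bat
C:\Pentaho\server\biserver-ee\start-pentaho.bat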

Any idea? Any help will be greatly appreciated.

Sheet borders

Can you tell me how to remove the borders?
I cannot edit the elements.
pr.PNG

Remove the sheet borders?

Please help: how can I remove the sheet borders? The report is large and it is impossible to edit because of them.
pr.PNG

newMapComponent - dynamic source for shape information?

The NewMapComponent for CDE is great, but it would be even better if we could get the shape information dynamically from a geodatabase instead of a fixed KML file. Some KML files are too large, and without SQL queries to filter them, they needlessly clog the application with unneeded data.

Is there any way to do that?

Instances preinitialized randomly with 0 values

Hello there!


I am experiencing strange behavior when using the WEKA API. Note that I've been using WEKA extensively for almost two years now, so this may not be a trivial problem. Some weeks ago I got a new laptop with a Core i7-4710HQ CPU. I mention that because the problem seems to be related to computing speed.


Using the old laptop, or when running my code in debug mode, everything is fine and behaves as expected. However, when running it normally, a weird thing happens: when I create a new Instance by writing


Code:

Instance i = new DenseInstance(instances.numAttributes());
which should create an empty instance with as many missing values as the parameter says, it instead creates an instance that has some values set to 0.


This doesn't always happen, only in about 1 of 4 cases. It very much depends on how fast the instance is further processed.

The result looks something like this:
Code:

?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,?
?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?
?,?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?
?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,?
?,?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?
?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0
?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,?
?,?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?
?,?,?,?,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?,0,0,?,?


Given the regularities, it seems as if there is a memory problem. Does anybody have a clue or an idea on how to prevent this, other than artificially slowing down the code?
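
Until the cause is found, a defensive construction may help (a minimal sketch reusing the post's instances variable; if zeros still appear after this, something is mutating the array concurrently and the bug is elsewhere): fill the value array with explicit missing values yourself instead of relying on the int constructor's fill.

Code:

import java.util.Arrays;
import weka.core.DenseInstance;
import weka.core.Instance;
import weka.core.Utils;

// build the backing array explicitly: every slot set to missing (NaN)
double[] vals = new double[instances.numAttributes()];
Arrays.fill(vals, Utils.missingValue());
Instance inst = new DenseInstance(1.0, vals); // weight 1.0, all values missing
inst.setDataset(instances);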