Channel: Pentaho Community Forums
Viewing all 16689 articles
Browse latest View live

User Defined Java Class to read/insert dynamic excel ranges

Hi everybody,

I'm using the POI API in a User Defined Java Class step to extract and insert dynamic data from a large number of Excel files.

I use an external Hibernate module to insert the data into a SQL Server database.

My problem is that it takes about 3 days to process roughly 1,000 Excel files.

How can I optimize this?

Thank you
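Since each workbook is independent, the two usual levers are parallelizing the per-file work and batching the Hibernate inserts. A minimal sketch of the parallel part, with the POI reading and Hibernate insert stubbed out as a placeholder:

```java
import java.nio.file.Path;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ParallelExcelLoader {

    // Placeholder for the real POI-read + Hibernate-insert work for one file.
    static void processOneFile(Path file, AtomicInteger processed) {
        // ... open workbook with POI, read the dynamic ranges, buffer rows ...
        processed.incrementAndGet();
    }

    // Process all files on a fixed-size thread pool and wait for completion.
    static int processAll(List<Path> files, int threads) {
        AtomicInteger processed = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (Path f : files) {
            pool.submit(() -> processOneFile(f, processed));
        }
        pool.shutdown();
        try {
            pool.awaitTermination(1, TimeUnit.HOURS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return processed.get();
    }

    public static void main(String[] args) {
        List<Path> files = List.of(Path.of("a.xlsx"), Path.of("b.xlsx"), Path.of("c.xlsx"));
        System.out.println(processAll(files, 2)); // prints 3
    }
}
```

On the Hibernate side, setting hibernate.jdbc.batch_size and periodically calling session.flush() / session.clear() usually matters more than the read side; it is worth measuring which of the two phases dominates the 3 days before optimizing either.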

Starting Talend jobs using the Pentaho BI server scheduler

Hi everyone,

I have Talend jobs for ETL that I want to schedule from the Pentaho BI server scheduler. I think, but am not sure, that it may be possible to create a Kettle job that executes the Talend jobs and then schedule that directly from the Pentaho BI server scheduler.

I haven't tried this yet because I'd first like to know whether there is a more direct way, such as running the exported Talend packages from an xaction file that I can then schedule. I don't really know whether an xaction can execute Windows .bat script files.

As for why I don't create Kettle jobs instead of using Talend: I am too deep into the project already, and I just switched from SpagoBI 3.6 (which includes the Talend engine) to Pentaho 4.8, which I think is more stable, since I had some problems with the BIRT reporting engine.

thank you for your help
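For reference, Talend exports each job with a launcher script (.bat on Windows), so a Kettle job with a single "Shell" job entry pointing at that script can indeed be scheduled from the BI server like any other job. If the same launch ever has to happen outside Kettle, a plain-Java equivalent is sketched below; the script path is hypothetical, and --context_param is Talend's standard way of passing context values:

```java
import java.util.ArrayList;
import java.util.List;

public class TalendLauncher {

    // Build the command line for a Talend-exported job script (path is hypothetical).
    static List<String> buildCommand(String scriptPath, String... contextParams) {
        List<String> cmd = new ArrayList<>();
        cmd.add("cmd.exe");
        cmd.add("/c");
        cmd.add(scriptPath);
        for (String p : contextParams) {
            cmd.add("--context_param");
            cmd.add(p);
        }
        return cmd;
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand("C:\\talend\\jobs\\my_job_run.bat", "runDate=2013-07-02");
        System.out.println(cmd);
        // To actually run it: new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```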

Attribute Selection

I am interested in doing attribute selection using Chi Squared and the other AttributeEvals. This requires Ranker to be the search method. I know that the attributes can be pruned by selecting the best N or by providing a threshold. But is there a way to set a statistical significance level for an attribute, such as a 95% or 99% confidence level? I'm doing binary classification, and I want to be able to specify that an attribute must differ between the two classes in a statistically significant way.
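One way to approximate this with Ranker's threshold option: for a chi-squared statistic with 1 degree of freedom (binary class vs. binary attribute), the standard critical values are about 3.841 at 95% confidence and 6.635 at 99%, so setting the threshold to one of those keeps only attributes significant at that level (attributes with more distinct values have more degrees of freedom and need larger critical values). A sketch of the filtering step, with made-up scores:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ChiSquaredFilter {

    // Chi-squared critical values for 1 degree of freedom.
    static final double CHI2_DF1_95 = 3.841;
    static final double CHI2_DF1_99 = 6.635;

    // Keep only attributes whose chi-squared score exceeds the critical value.
    static Map<String, Double> significantAttributes(Map<String, Double> scores, double critical) {
        Map<String, Double> kept = new LinkedHashMap<>();
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            if (e.getValue() > critical) {
                kept.put(e.getKey(), e.getValue());
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("age", 12.7);     // hypothetical ChiSquaredAttributeEval scores
        scores.put("zipcode", 0.4);
        scores.put("income", 5.2);
        System.out.println(significantAttributes(scores, CHI2_DF1_95).keySet()); // [age, income]
    }
}
```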

Subreport link problem in Firefox

Hello, when I try to open a subreport within a CDE dashboard in Firefox, it is not shown and the following appears:
error-subreporte.jpg

This happens when I view it as HTML, but if I view it as a PDF it works correctly.
Does anyone have any idea what may be going on?
I'm using BI Server 4.8 on Ubuntu 12.04, with Firefox 22.0.

Thank you very much. Regards,

Marcos

Filter rows doesn't recognize negatives as equals?

So, I have this example here...

Code:

2013/07/02 15:22:32 - Sucess.0 - ------------> Linenr 1------------------------------
2013/07/02 15:22:32 - Sucess.0 - codigoconta = 872.10.07
2013/07/02 15:22:32 - Sucess.0 - valorconta = 80.770.105,55
2013/07/02 15:22:32 - Sucess.0 - (11+12+20)-(14+16) = 80.770.105,55
2013/07/02 15:22:32 - Sucess.0 -
2013/07/02 15:22:32 - Sucess.0 - ====================
2013/07/02 15:22:32 - Err.0 -
2013/07/02 15:22:32 - Err.0 - ------------> Linenr 1------------------------------
2013/07/02 15:22:32 - Err.0 - codigoconta = 872.10.08
2013/07/02 15:22:32 - Err.0 - valorconta = -1.915.937.996,76
2013/07/02 15:22:32 - Err.0 - (11+12+20)-(14+16) = -1.915.937.996,76
2013/07/02 15:22:32 - Err.0 -
2013/07/02 15:22:32 - Err.0 - ====================

The next step is a Filter Rows step, which checks whether valorconta = (11+12+20)-(14+16). Each exit goes to a "Write to log" step until I can solve this problem.
Why do positive equal values go to success while negative equal values go to error?
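One hypothesis worth testing before blaming the step: if valorconta and the computed sum are floating-point Number fields, two values can print identically yet compare unequal, and which rows are affected depends on the exact bits rather than the sign as such. A sketch of the usual workaround, comparing after rounding both sides to the two decimals the data carries:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class MoneyCompare {

    // Naive double equality can fail even when the printed values match.
    static boolean naiveEquals(double a, double b) {
        return a == b;
    }

    // Compare after rounding both sides to a fixed number of decimal places.
    static boolean equalsRounded(double a, double b, int scale) {
        BigDecimal x = BigDecimal.valueOf(a).setScale(scale, RoundingMode.HALF_UP);
        BigDecimal y = BigDecimal.valueOf(b).setScale(scale, RoundingMode.HALF_UP);
        return x.compareTo(y) == 0;
    }

    public static void main(String[] args) {
        double sum = 0.1 + 0.2;                         // 0.30000000000000004 in binary floating point
        System.out.println(naiveEquals(sum, 0.3));      // false
        System.out.println(equalsRounded(sum, 0.3, 2)); // true
    }
}
```

In Kettle terms this corresponds to rounding both fields (for example with a Calculator step) before the Filter Rows comparison.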

CGG Image not showing up in PRD

EDIT: Issue seems to be CTools related and not PRD related. I posted to the wrong forum, sorry!


Hi, I am running into a very frustrating problem. I am trying to get a CGG chart to display in a Pentaho Report. I have done it many times before, but for some reason it does not want to work for some of the charts I have been making. I've followed many guides, this one in particular: http://www.osbi.fr/exporter-en-pdf-v...entaho-cdecdf/

Entering the chart URL, with the parameters defined in it, into a browser displays the chart just fine, and I can display this chart in PRD using a static image. The chart URL looks something like this:
Quote:

http://myipaddress:8080/pentaho/content/cgg/Draw?script=/folder/chart.js&outputType=png&userid=admin&password=password&paramstart_date=2013-06-01&paramend_date=2013-07-01


Introducing parameters and using image-field causes the graph to no longer display. The post-processing formula for the image parameter looks something like this:
Quote:

="http://myipaddress:8080/pentaho/content/cgg/Draw?script=/folder/chart.js&outputType=png&userid=admin&password=password&paramstart_date="&[first_date]&"&paramend_date"&[second_date]

However, I do not believe the problem is with my URL or with the way I have set up my report. I have another .js file for another CGG chart, in the same solution and the same folder, that displays just fine in PRD when I substitute the relevant variables.

I really do not know what the problem might be. I have been working on this issue for a long time, and if anyone could give me some insight it would be greatly appreciated.

How to get log file name supplied in kitchen cmd line?

How can I get the log file name that was supplied on the kitchen.bat command line?
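As far as I know there is no built-in step that exposes it, so the common workarounds are to pass the same path again as a named parameter, or to parse the command line yourself. A hypothetical sketch of the parsing approach; the option spellings (/logfile: on Windows, -logfile= elsewhere) should be verified against your Kettle version:

```java
public class KitchenArgs {

    // Extract the value of the logfile option from a kitchen command line.
    // Accepts both "/logfile:path" (Windows) and "-logfile=path" styles.
    static String logFileName(String[] args) {
        for (String arg : args) {
            if (arg.startsWith("/logfile:")) {
                return arg.substring("/logfile:".length());
            }
            if (arg.startsWith("-logfile=")) {
                return arg.substring("-logfile=".length());
            }
        }
        return null; // option not supplied
    }

    public static void main(String[] args) {
        String[] cmd = {"/file:myjob.kjb", "/logfile:C:\\logs\\run.log"};
        System.out.println(logFileName(cmd)); // C:\logs\run.log
    }
}
```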

Read specific data from unstructured file

Can anyone explain how to retrieve data from an unstructured file, transform it, and load it into a database table?
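The usual PDI pattern is a Text file input step reading whole lines, followed by a Regex Evaluation (or Modified Java Script Value) step that extracts the fields, and finally a Table output step. The extraction core looks like the sketch below; the pattern, sample line, and field names are made up for illustration, since the post doesn't show the file format:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class UnstructuredExtract {

    // Hypothetical pattern: an order id followed by a shipping date somewhere in the line.
    static final Pattern LINE =
        Pattern.compile("ORDER\\s+(\\d+)\\s+shipped on\\s+(\\d{4}-\\d{2}-\\d{2})");

    // Returns {orderId, date}, or null when the line doesn't match.
    static String[] extract(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.find()) {
            return null;
        }
        return new String[] { m.group(1), m.group(2) };
    }

    public static void main(String[] args) {
        String[] row = extract("... ORDER 1234 shipped on 2013-07-02 ...");
        System.out.println(row[0] + ", " + row[1]); // 1234, 2013-07-02
    }
}
```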

Array Index Out of bound Exception

Hi,

I am using standalone Saiku to view reports for my cube, but I have run into this exception while working with Saiku and have no idea why it is happening.

Quote:

2013-06-28 13:08:05,068 ERROR [org.saiku.web.rest.resources.QueryResource] Cannot execute query (F8DE2B49-B87F-34A5-845D-B579E22BBA6E)
org.saiku.service.util.exception.SaikuServiceException: Can't execute query: F8DE2B49-B87F-34A5-845D-B579E22BBA6E at org.saiku.service.olap.OlapQueryService.execute(OlapQueryService.java:238) at org.saiku.service.olap.OlapQueryService.execute(OlapQueryService.java:203) at org.saiku.web.rest.resources.QueryResource.execute(QueryResource.java:718) at org.saiku.web.rest.resources.QueryResource$$FastClassByCGLIB$$e130f1a0.invoke(<generated>) at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:191) at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:617) at org.saiku.web.rest.resources.QueryResource$$EnhancerByCGLIB$$e19ca8a4.execute(<generated>) at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60) at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185) at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75) at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288) at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147) at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108) at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147) at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84) at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469) at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400) at 
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349) at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537) at org.codehaus.enunciate.modules.jersey.EnunciateJerseyServletContainer.service(EnunciateJerseyServletContainer.java:248) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.codehaus.enunciate.webapp.HTTPRequestContextFilter.doFilter(HTTPRequestContextFilter.java:36) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:343) at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109) at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:97) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:100) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at 
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:78) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:35) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:177) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.authentication.ui.DefaultLoginPageGeneratingFilter.doFilter(DefaultLoginPageGeneratingFilter.java:91) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:187) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:79) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355) at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:149) at 
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237) at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:852) at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588) at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489) at java.lang.Thread.run(Thread.java:679)Caused by: org.olap4j.OlapException: mondrian gave exception while executing query at mondrian.olap4j.MondrianOlap4jConnection$Helper.createException(MondrianOlap4jConnection.java:840) at mondrian.olap4j.MondrianOlap4jStatement.executeOlapQueryInternal(MondrianOlap4jStatement.java:423) at mondrian.olap4j.MondrianOlap4jStatement.executeOlapQuery(MondrianOlap4jStatement.java:347) at org.saiku.olap.query.OlapQuery.execute(OlapQuery.java:255) at org.saiku.service.olap.OlapQueryService.execute(OlapQueryService.java:222) ... 
69 moreCaused by: mondrian.olap.MondrianException: Mondrian Error:Internal error: Error while executing query [select NON EMPTY Hierarchize(Union(Crossjoin({[Measures].[Hits]}, [Time.Yearly].[Day].Members), Crossjoin({[Measures].[Hits]}, [Time.Yearly].[Hour].Members))) ON COLUMNS, NON EMPTY {Hierarchize({{[Destination].[Pipe Code].Members}, {[Destination].[Domain].Members}, {[Destination].[Application Type].Members}})} ON ROWSfrom [Hits]] at mondrian.resource.MondrianResource$_Def0.ex(MondrianResource.java:972) at mondrian.olap.Util.newInternal(Util.java:2417) at mondrian.olap.Util.newError(Util.java:2433) at mondrian.rolap.RolapConnection.executeInternal(RolapConnection.java:706) at mondrian.rolap.RolapConnection.access$000(RolapConnection.java:51) at mondrian.rolap.RolapConnection$1.call(RolapConnection.java:622) at mondrian.rolap.RolapConnection$1.call(RolapConnection.java:621) at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) at java.util.concurrent.FutureTask.run(FutureTask.java:166) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ... 
1 moreCaused by: java.lang.ArrayIndexOutOfBoundsException: 0 at mondrian.rolap.agg.DenseObjectSegmentDataset.getObject(DenseObjectSegmentDataset.java:56) at mondrian.rolap.agg.SegmentWithData.getCellValue(SegmentWithData.java:179) at mondrian.rolap.RolapStar.getCellFromCache(RolapStar.java:157) at mondrian.rolap.agg.AggregationManager.getCellFromCache(AggregationManager.java:188) at mondrian.rolap.FastBatchingCellReader.get(FastBatchingCellReader.java:143) at mondrian.rolap.RolapEvaluator.evaluateCurrent(RolapEvaluator.java:660) at mondrian.olap.fun.CrossJoinFunDef.checkData(CrossJoinFunDef.java:999) at mondrian.olap.fun.CrossJoinFunDef.nonEmptyList(CrossJoinFunDef.java:949) at mondrian.olap.fun.CrossJoinFunDef.nonEmptyOptimizeList(CrossJoinFunDef.java:483) at mondrian.olap.fun.CrossJoinFunDef$BaseListCalc.evaluateList(CrossJoinFunDef.java:356) at mondrian.olap.fun.UnionFunDef$1.evaluateList(UnionFunDef.java:52) at mondrian.olap.fun.HierarchizeFunDef$1.evaluateList(HierarchizeFunDef.java:46) at mondrian.calc.impl.AbstractListCalc.evaluateIterable(AbstractListCalc.java:71) at mondrian.rolap.RolapResult.executeAxis(RolapResult.java:875) at mondrian.rolap.RolapResult.evalLoad(RolapResult.java:700) at mondrian.rolap.RolapResult.loadMembers(RolapResult.java:656) at mondrian.rolap.RolapResult.<init>(RolapResult.java:288) at mondrian.rolap.RolapConnection.executeInternal(RolapConnection.java:671) ... 8 more


This is the MDX query that was running when the exception occurred:
MDX Query: select NON EMPTY Hierarchize(Union(Crossjoin({[Measures].[Hits]}, [Time.Yearly].[Day].Members), Crossjoin({[Measures].[Hits]}, [Time.Yearly].[Hour].Members))) ON COLUMNS, NON EMPTY {Hierarchize({{[Destination].[Pipe Code].Members}, {[Destination].[Domain].Members}, {[Destination].[Application Type].Members}})} ON ROWS from [Hits]

arrayindex.jpg


I look forward to your replies.
Thank you

Dynamic Dashboard

Hello,

I am looking for a Pentaho solution for creating dynamic and interactive dashboards (an equivalent of QlikView).

Do you know of something like this?

Thank you for your answer.

Bye

Integer Leading Spaces

We have recently started experimenting with Pentaho Data Integration; so far we have had excellent luck, and it really is a fine piece of software. I have managed to tackle every problem I ran into thanks to previous forum posts and tracker info.

However, this one is just beyond me, and the problem is getting really annoying:

- A REST Client retrieves JSON data
- The JSON data is processed by the JSON input step
- Which is in turn passed to a Table output step

Everything works fine, except that the Table output step produces the following error:
Data truncation: Out of range value for column 'someinteger' at row 1

So I replaced the Table output with a Text file output step and found that, for some reason, all integers get a leading space:
Code:

date;someinteger;"somestring";"anotherstring";anotherinteger
2013-07-03; 506399400611;"Lorem Ipsum";"Amet"; 749271

I managed to replicate this issue with an XML Output as well.

Now, if the data is handled the following way by the stream:
- JSON Input interprets everything as strings
- and everything is output as strings
==> then things work, except that I can't pass a string to a database's int column.

However, if I convert a string to an integer, no matter how, I get a leading space:
- if JSON Input interprets it as an integer, I get the leading space
- if JSON Input interprets it as a string and I convert it with Select values, I get the leading space
- if JSON Input interprets it as a string and I convert it in the text output, I get the leading space

My guess is that this leading space causes the table output issue in the first place. Is there a way I can sanitize these integers?
I am using Kettle stable release 4.4.0.
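Until the source of the space is pinned down, one workaround is to keep the values as strings through the stream, trim them, and convert at the last moment (Kettle's String operations step has a trim option, or the same thing in a UDJC). The conversion core is just:

```java
public class IntSanitize {

    // Trim whitespace (including the stray leading space) and parse as a long.
    static long toLong(String raw) {
        return Long.parseLong(raw.trim());
    }

    public static void main(String[] args) {
        System.out.println(toLong(" 506399400611")); // 506399400611
        System.out.println(toLong(" 749271"));       // 749271
    }
}
```

It may also be worth trying an explicit format mask of "#" on the integer fields in the output steps, in case the default format is reserving a position for the sign.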

Web Tier APIs

Hi everyone,
I would like to integrate Pentaho into a complex web application. I suppose the best way to integrate it is via web services, because that approach is very agile and it guarantees the isolation of the Pentaho BI server.


I read these documents


http://wiki.pentaho.com/display/Serv...er+APIs+-+Home
https://docs.google.com/spreadsheet/...2c&hl=en#gid=0


but there are no samples about them.

I would like to know whether these web service APIs are available in Pentaho CE, and whether anyone has ever used them via web services.


I hope someone can help me.
regards
Gaetano

How to add serial no.

I want to add a serial number to my reports. How do I add the ItemCountFunction to a report? I am using Reporting 3.9.1.

Proof of concept with large mondrian cube

Hi,

I'm writing to find out your suggestions on how best to optimize the following cube / schema. I'm trying to build the proof of concept using Pentaho CE as the BI platform for our company, Oracle DB for data mart storage, and Mondrian for cube design.

The underlying Accounts Receivable data mart has the following row counts:

DIM_AR: 833
DIM_AR_REF: 1
DIM_AR_REMARK: 1,344,880
DIM_CUSTOMER: 1,380,180
DIM_CUSTOMER_FIN: 229
DIM_DATE: 7,302
FACT_AR: 67,476,150

The schema has one additional dimension, DIM_AR_REF (18,610,039 rows), which is a degenerate dimension on FACT_AR.
DIM_CUSTOMER has many fields (around 30) and stores customers from 18 countries.

All dimensions have primary keys, and FACT_AR has indexed foreign keys to all underlying dimensions.
I'm thinking about partitioning the fact table by country / posting period. In many cases end users want to analyze transactions from just one country; however, Country is part of DIM_CUSTOMER, so does it make sense to add country to FACT_AR and create a compound key on DIM_CUSTOMER (primary) and FACT_AR (foreign)?
The Mondrian schema is attached to this post.

I've also added some aggregation tables.
Any suggestions on what I should do to optimize query performance, especially for detailed analysis, when individual rows from the fact table / customer dimension need to be returned?
Has anyone successfully performed this kind of analysis in Pentaho with a data mart of this size?
Is Saiku an adequate analysis tool, or is the legacy Pentaho Analyzer better?

The Linux server the BI server runs on has the following parameters; I've assigned 2.5 GB of memory to CATALINA_OPTS in the BI server startup script.
processor : 0
vendor_id : GenuineIntel
cpu family : 6
model : 15
model name : Intel(R) Xeon(R) CPU X5355 @ 2.66GHz
stepping : 11
cpu MHz : 2666.759
cache size : 4096 KB
fpu : yes
fpu_exception : yes
cpuid level : 10
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss syscall nx lm constant_tsc up arch_perfmon pebs bts tsc_reliable nonstop_tsc aperfmperf unfair_spinlock pni ssse3 cx16 hypervisor lahf_lm dts
bogomips : 5333.51
clflush size : 64
cache_alignment : 64
address sizes : 40 bits physical, 48 bits virtual

Pawel J.

Textfile Input Error

Hi dear Kettlers,

I am trying to read my text file, whose fields are separated by "|" (not tab) and in which each new row of data begins on a new line. See the example below:

...
71298767|2012|633870|80.54|
71298767|2012|633937|132.63|
71298767|2012|634038|140.48|
...

In the "Text file input" step, under Content > Format, I chose DOS. I am using Windows. So why do I get the error "DOS format was specified but only a single line feed character was found, not 2"?
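That error message means the rows end with a bare line feed (\n) rather than the carriage-return + line-feed pair (\r\n) that the DOS setting expects, so choosing Unix as the format should clear it; being on Windows doesn't matter, only the file's own line endings do. A small sketch of how to check what a file actually contains:

```java
import java.nio.charset.StandardCharsets;

public class LineEndings {

    // Returns "DOS" if every line feed is preceded by a carriage return, else "Unix".
    static String detect(byte[] content) {
        for (int i = 0; i < content.length; i++) {
            if (content[i] == '\n' && (i == 0 || content[i - 1] != '\r')) {
                return "Unix";
            }
        }
        return "DOS";
    }

    public static void main(String[] args) {
        System.out.println(detect("a|b|c\nd|e|f\n".getBytes(StandardCharsets.UTF_8)));     // Unix
        System.out.println(detect("a|b|c\r\nd|e|f\r\n".getBytes(StandardCharsets.UTF_8))); // DOS
    }
}
```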


Appreciate your suggestion/help
thank you in advance

Reading from HBase in a mapper transformation

Hello,


I am trying to read from HBase in Kettle 4.4.0 (29 October 2012) in my mapper transformation, using MapReduce Input and HBase Row Decoder, based on this example job/workflow:
http://jira.pentaho.com/browse/PDI-7...mment-tabpanel
I get the following error when executing the job:


No Table was provided


It seems the mapper / job is unaware of the table name, which is set both in the HBase Row Decoder and in the job config dialog as the 'hbase.mapred.inputtable' user-defined variable.
Either the issue (the link above) is not actually fixed, or my Kettle version is wrong.
The Hadoop host is CDH4.2.1 with HBase 0.94, and I have the following jars:
./plugins/pentaho-big-data-plugin/hadoop-configurations/cdh4/hbase-0.92.1-cdh4.0.0.jar
./plugins/pentaho-big-data-plugin/hadoop-configurations/cdh4/pentaho-hadoop-shims-cdh4-1.3.0.jar
Also, reading from HBase with the HBase Input step in a non-mapper transformation works.


Thanks

Default Filter Options for User Console

I am wondering if there is a way to set default filter options for the User Console. I have been looking for a while and cannot seem to figure this out.

Currently the filter options for interactive report are:

filteroptions.png


When designing an Interactive Report we train our users to deselect "Select Distinct" and to add a row limit, but we find that slightly inconvenient.

Designing reports from the User Console against our database is very slow without these settings.

Is there a way I can make "Select Distinct" deselected and "Row Limit" selected, with a default value, out of the box?

Create new fields from values in the existing fields

Hi,

I am getting started with Pentaho Data Integration and I have a question about the following.
I have a file like this:

Field0;Field1;Field2;Field3
971551933594=Number;12=Language;FALSE=Supressed;0=Status
971551933597=Number;1=Language;0=Status;23 June 2013 14:48:27 (1371998907)=Date
971551933601=Number;12=Language;FALSE=Supressed;0=Status

And I need to get the following:

Number;Language;Date;Supresed;Status
971551933594;12;;FALSE;0
971551933597;1;23 June 2013 14:48:27 (1371998907);FALSE;0
971551933601;12;;FALSE;0

That is, I don't know the names of the output fields in advance (but I know all the possible fields), so I must take each name from the field value itself.
I have tried a "Modified Java Script Value" step, but I need to know how to iterate over all the fields of each row.
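Since every value already carries its own field name after the '=', one way to do this in a UDJC (or the same logic in JavaScript) is to split each row on ';', split each part at its last '=', and fill a map keyed by the known output field names. A sketch using the sample rows from the post (keeping the post's spelling of "Supressed"):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KeyedFieldsParser {

    // The full set of possible output fields, in output order.
    static final String[] OUTPUT_FIELDS = {"Number", "Language", "Date", "Supressed", "Status"};

    // Turn "value=Name;value=Name;..." into a name -> value map.
    static Map<String, String> parseRow(String row) {
        Map<String, String> out = new LinkedHashMap<>();
        for (String field : OUTPUT_FIELDS) {
            out.put(field, ""); // default for fields absent from this row
        }
        for (String part : row.split(";")) {
            int eq = part.lastIndexOf('=');
            if (eq >= 0) {
                out.put(part.substring(eq + 1), part.substring(0, eq));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> row = parseRow("971551933594=Number;12=Language;FALSE=Supressed;0=Status");
        System.out.println(row); // {Number=971551933594, Language=12, Date=, Supressed=FALSE, Status=0}
    }
}
```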

Thanks in advance,
malefu

How many requests does Kettle make when using a VFS file path in the Get XML Data step?

I have a file path of zip:http://www.someweb.com/all_xml.zip! and a wildcard of ^.*\.xml

Say there are 100 XML files in the zip file; how many requests will Kettle make? 100, one per XML file, or just 1 for the whole zip file?

Thank you!