Channel: Pentaho Community Forums
Viewing all 16689 articles

move

How do I copy all files from one Windows directory to another using JavaScript, instead of specifying each filename?

SOURCE DIR
a.txt
b.csv
c.jpg

TARGET DIR
a.txt
b.csv
c.jpg

calculated measure using previous member of selected dimension

Hi all,
I have a cube where the measure is the maximum reading for a selected dimension member and date.

Now I want a calculated member: the measure for the selected date and dimension member, minus the measure for the same dimension member on the previous date.

Let's say I have:

2014-01-01----p1----100
2014-01-02----p1----200
2014-01-01----p2----100

I want a calculated measure such that, when 2014-01-02 is selected, the value for p1 is 200 - 100 = 100.
I tried something like cm = [measure].[max].currentmember - [measure].[max].previousmember

The same should happen when a month, year, or hour is selected: the previous member at the selected time level should be used. Is that possible in the cube structure?

Please help me; is there any other way to make this work?


The underlying requirement: we have an energy meter that gives a reading every 10 minutes (at minutes 00, 10, 20, 30, 40, 50). To calculate the number of units consumed in an hour, I have to take the current hour's 50th-minute reading minus the previous hour's 50th-minute reading.

So I built a cube that gets the maximum reading per time dimension level (hour, day, month, year); what I still have to achieve is the calculation above.
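A sketch of the calculated member this usually needs: apply PrevMember to the time dimension's current member rather than to the measure, so it works at whatever level (hour, day, month, year) is selected. The hierarchy, measure, and cube names below are placeholders for the real schema:

```mdx
WITH MEMBER [Measures].[Units Consumed] AS
    [Measures].[Max Reading]
    - ( [Time].CurrentMember.PrevMember, [Measures].[Max Reading] )
SELECT { [Measures].[Units Consumed] } ON COLUMNS,
       { [Time].[2014-01-02] } ON ROWS
FROM [EnergyCube]
```

Because PrevMember stays at the current level, selecting a day subtracts the previous day's maximum and selecting an hour subtracts the previous hour's, with no extra logic per level.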

use multirow result of "table input" as SQL to another "table input"

Hello,
We are migrating data from Oracle or SQL Server to a new database design in MySQL, Oracle or SQL Server (whichever the customer is using). We have certain user-definable columns that exist in every table, and we are moving these columns to rows in a standard table that contains all user-defined columns.

I would like to create a generic job/transformation that runs a query against the source database (different for Oracle and SQL Server) to generate a query with all the columns that need to be brought over for a given table, so I don't have to hand-write a query for every table in our database (over 1,000 tables). The output of this first "table input" is multiple rows that together contain the SQL query, and I want to use that output as the SQL for the next "table input". I'm having a hard time figuring out how to do this.

I was able to use the SQL Server "stuff" command to concatenate all rows together into a single column and assign that field to a variable. However, I cannot use this solution for Oracle, because the generated column can be over 4,000 characters and all the Oracle options for putting multiple rows into one column truncate my query result. So I need a way to take the multi-row output of a "table input" step and use it as the SQL portion of another "table input".


Here's the sample query in SQL Server syntax ('NONCONFORMANCE' will be replaced with a variable indicating the table):
SELECT 'SELECT '''+COLUMN_NAME+''' AS REFERENCE_TYPE, '+
CASE WHEN DATA_TYPE IN ('varchar','CHAR','NVARCHAR','NCHAR') THEN COLUMN_NAME
WHEN DATA_TYPE IN ('datetime','date') THEN 'CONVERT(varchar,'+COLUMN_NAME+',109)'
ELSE 'CONVERT(varchar,'+COLUMN_NAME+')' END+
' AS textValue, b.SYSID AS XREF_SYSID'+
' FROM '+TABLE_NAME+' a INNER JOIN XREF_TABLE b ON a.'+PK+' = b.PK AND b.TABLE_NAME = '''+a.TABLE_NAME+''' WHERE '+COLUMN_NAME+' IS NOT NULL'+
CASE WHEN a.ORDINAL_POSITION = b.MAX_POS THEN ' ORDER BY REFERENCE_TYPE' ELSE ' UNION ' END
FROM INFORMATION_SCHEMA.COLUMNS a CROSS JOIN
(SELECT MAX(ORDINAL_POSITION) AS MAX_POS
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME='NONCONFORMANCE'
AND (COLUMN_NAME like '%REFERENCE[_]%' OR
COLUMN_NAME LIKE '%DATE[_]%' OR
COLUMN_NAME LIKE '%NUM[_]%')) b CROSS JOIN
(SELECT COLUMN_NAME AS PK, ROW_NUMBER() OVER(ORDER BY ORDINAL_POSITION) AS rn
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME='NONCONFORMANCE') c
WHERE a.TABLE_NAME='NONCONFORMANCE'
AND (a.COLUMN_NAME like '%REFERENCE[_]%' OR
a.COLUMN_NAME LIKE '%DATE[_]%' OR
a.COLUMN_NAME LIKE '%NUM[_]%')
AND c.rn=1


This is the result of the query (it will be different for every table):
SELECT 'NCM_REFERENCE_1' AS REFERENCE_TYPE, NCM_REFERENCE_1 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_1 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_2' AS REFERENCE_TYPE, NCM_REFERENCE_2 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_2 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_3' AS REFERENCE_TYPE, NCM_REFERENCE_3 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_3 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_4' AS REFERENCE_TYPE, NCM_REFERENCE_4 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_4 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_5' AS REFERENCE_TYPE, NCM_REFERENCE_5 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_5 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_6' AS REFERENCE_TYPE, NCM_REFERENCE_6 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_6 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_7' AS REFERENCE_TYPE, NCM_REFERENCE_7 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_7 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_8' AS REFERENCE_TYPE, NCM_REFERENCE_8 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_8 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_9' AS REFERENCE_TYPE, NCM_REFERENCE_9 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_9 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_10' AS REFERENCE_TYPE, NCM_REFERENCE_10 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_10 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_11' AS REFERENCE_TYPE, NCM_REFERENCE_11 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_11 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_12' AS REFERENCE_TYPE, NCM_REFERENCE_12 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_12 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_13' AS REFERENCE_TYPE, NCM_REFERENCE_13 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_13 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_14' AS REFERENCE_TYPE, NCM_REFERENCE_14 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_14 IS NOT NULL UNION
SELECT 'NCM_REFERENCE_15' AS REFERENCE_TYPE, NCM_REFERENCE_15 AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_REFERENCE_15 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_1' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_1) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_1 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_2' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_2) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_2 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_3' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_3) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_3 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_4' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_4) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_4 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_5' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_5) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_5 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_6' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_6) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_6 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_7' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_7) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_7 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_8' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_8) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_8 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_9' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_9) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_9 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_10' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_10) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_10 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_11' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_11) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_11 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_12' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_12) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_12 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_13' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_13) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_13 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_14' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_14) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_14 IS NOT NULL UNION
SELECT 'NCM_NUM_REFERENCE_15' AS REFERENCE_TYPE, CONVERT(varchar,NCM_NUM_REFERENCE_15) AS textValue, b.SYSID AS XREF_SYSID FROM NONCONFORMANCE a INNER JOIN XREF_TABLE b ON a.NONCONFORMANCE_SYSID = b.PK AND b.TABLE_NAME = 'NONCONFORMANCE' WHERE NCM_NUM_REFERENCE_15 IS NOT NULL ORDER BY REFERENCE_TYPE
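One way around the Oracle 4,000-character aggregation limit is to not concatenate in the database at all: read the metadata rows into the transformation, assemble the UNION query on the PDI side (for example in a "Modified Java Script Value" or "Group By" step), put the finished string into a variable with "Set Variables" in a first transformation, and let the second transformation's "Table input" use that variable as its SQL. The sketch below shows the assembly logic in plain JavaScript; the column descriptors, primary-key name, and table names are illustrative, not taken from your schema:

```javascript
// Build the unpivot query from column-metadata rows, mirroring the SQL
// generated by the INFORMATION_SCHEMA query above. Doing the string
// concatenation here avoids database-side limits on aggregated strings.
function buildUnpivotSql(table, pk, columns) {
  const selects = columns.map(function (c) {
    // Text columns pass through; others get wrapped in CONVERT(varchar, ...)
    const value = c.isText ? c.name : 'CONVERT(varchar,' + c.name + ')';
    return "SELECT '" + c.name + "' AS REFERENCE_TYPE, " + value +
      ' AS textValue, b.SYSID AS XREF_SYSID' +
      ' FROM ' + table + ' a INNER JOIN XREF_TABLE b' +
      ' ON a.' + pk + " = b.PK AND b.TABLE_NAME = '" + table + "'" +
      ' WHERE ' + c.name + ' IS NOT NULL';
  });
  return selects.join(' UNION ') + ' ORDER BY REFERENCE_TYPE';
}
```

In a real transformation the rows would come from the metadata query and the resulting string would feed a "Set Variables" step, so the second "Table input" simply reads ${UNPIVOT_SQL} with "Replace variables in script" enabled.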

Sample Pentaho plugin

Hi folks,
I'm trying to figure out why the echo-plugin REST endpoint doesn't work. I downloaded the source from the Pentaho repository, built it, and deployed it into my biserver. I copied the jar to tomcat/lib and the echo-plugin folder into system. The HTML page and action work like a charm, but the REST call returns a 404 Not Found. I wrote a new plugin based on the echo-plugin and it doesn't work either. The code follows:


My REST

Code:

package com.dgos.pentaho.xDownloader.rest;


import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import static javax.ws.rs.core.MediaType.APPLICATION_JSON;
import static javax.ws.rs.core.MediaType.APPLICATION_XML;


@Path("/xdownloader/api/download")
public class DownloaderRest {


    @GET
    @Path("/export")
    @Produces({ APPLICATION_XML, APPLICATION_JSON })
    public void getFile() {
        System.out.println("ENTERED!");
    }


}

Spring XML

Code:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ws="http://jax-ws.dev.java.net/spring/core"
    xmlns:wss="http://jax-ws.dev.java.net/spring/servlet"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
                          http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-2.5.xsd
                          http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-2.5.xsd
                          http://jax-ws.dev.java.net/spring/core http://jax-ws.dev.java.net/spring/core.xsd
                          http://jax-ws.dev.java.net/spring/servlet http://jax-ws.dev.java.net/spring/servlet.xsd">


    <context:annotation-config />


    <!-- REST resources -->
    <bean id="api" class="org.pentaho.platform.web.servlet.JAXRSPluginServlet" />
    <bean class="com.dgos.pentaho.xDownloader.rest.DownloaderRest" />


</beans>

Plugin xml

Code:

<?xml version="1.0" encoding="UTF-8"?>
<plugin title='xDownloader plugin'>
  <bean id='xDownloaderAction' class='org.dgos.xDownloader.xDownloaderAction' />
</plugin>

My action is empty for now.

I call the REST endpoint like this:
http://localhost:8080/mypentaho/plugin/xdownloader/api/download/export
Why doesn't the plugin work? Any suggestions?

Weird Kerberos auth happening when you SFTP from Kettle in Windows

I have a job with a few SFTP steps that works fine when run from Spoon. If I run it using Kettle from the command line on Windows, though, I get the following weird prompts every time the job reaches an SFTP step. If I hit Enter twice it seems to move on happily, but this makes the job difficult to schedule.

Here are the prompts:

Kerberos username [gkanks]:
Kerberos password for gkanks:

I am running 5.0.1.A on Windows 8.

I do not see any obvious setting I could change in the step or job.

Any ideas?

BI Server CE 5.1 Oracle JDBC connection through "Manage Data Sources" problem

Multiple queries with inline subreports

I have the task of creating a report from two different MongoDB collections in PRD 5.1.0. The report will have rows that combine fields from the different collections. I experimented with banded subreports, but quickly realized that a subreport executes its entire query, which will not work. I basically need field attributes from the two collections located in the same row, meaning that for a given row I need just one record at a time returned from each query. The two collections have the same number of records and can be sorted so that the records correspond.


Basically, can I use subreports (inline or banded) in some way to make the report look and work as though it used a single query? Is this possible?

SAP Testing Online Training in USA | Online SAP Testing Training in INDIA | UK

Teksonit offers SAP Testing Online Training. Our SAP Testing trainers come with vast work experience and teaching skills. All our faculty are dedicated to completing your course on the given schedule, and we also allow you to record the classes from your end so that you can refer to them again whenever required. We also give interview support and technical support. We market your profile in the USA, UK, Singapore, India, and South Africa. After completion of your course we will help you to clear your interviews and also assist you to get certified in SAP Testing.
SAP Testing Training Course Content Outline :
1. Overview of Testing
2. Definition
3. Importance of Testing
4. Concepts of Testing
5. Software Development Life Cycle (SDLC)
6. Software Testing Life Cycle (STLC)
7. Defect Management Life Cycle/ Bug Life Cycle
8. Get interacted with SAP consultants
9. Revised content from time to time
10. Extended support after the course completion
Contact us : India : +91 9391855249
USA : +1 010-674-9448
Email : teksonit@gmail.com URL: http://www.teksonit.com/sap-testing-online-training/

Hadoop Online Training in INDIA | Big data Online Training in INDIA

TEKSONIT provides Hadoop Online Training based on the specific needs of the learners; in particular, we give innovative one-to-one classes, which have great opportunities in the present IT market. After completion of your course we will help you to clear your interviews and also assist you to get certified in Hadoop. We guarantee 100% satisfaction and provide best-quality real-time online training.

Some of the Hadoop Course topics that covered by our professionals:


  1. Introduction to BIG Data Hadoop
  2. Parallel Computers vs. Distributed Computing technique
  3. Hadoop installation on your system
  4. How to install Hadoop cluster on multiple nodes
  5. Hadoop Daemons introduction: Name Node, Data Node, Job Tracker and Task Tracker
  6. Exploring the HDFS Apache Web UI & Exploring HDFS (Hadoop Distributed File System)
  7. Architecture of Name Node (FsImage, EditLog & location of replicas) Secondary Name Node Architecture
  8. Data Node architecture
  9. Exploring JobTracker & TaskTracker
  10. How a client submits a Map-Reduce job
  11. Exploring Mapper, Reducer, Combiner


And many sub topics are there for more details please go through the website.

Please call us for the Demo Classes we have regular batches and weekend batches.

Contact Number:
USA: +1 010-674-9448,
INDIA: +91 939-185-5249,

Email: teksonit@gmail.com,

Web: http://www.teksonit.com/hadoop-online-training/

Online SAP MM training in Hyderabad | SAP MM online Training in India

The TeksonIT training facility offers SAP MM Online Training. TeksonIT has taken great steps in providing best-quality online classes. We have placed our students in India, USA, UK, Singapore, New Zealand, Canada, Australia, Japan, and Sweden. We also provide job support and interview support. Our SAP MM trainers come with vast work experience and teaching skills. SAP MM online training is your one-stop solution to learn SAP MM from the comfort of your home with flexible schedules.

SAP MM online training content :


  • Introduction to ERP
  • SAP Navigation
  • Master data
  • Inventory management
  • Invoice verification


Highlights in our training:


  • Very in depth course material with real time scenarios.
  • We are providing class with highly qualified trainer.
  • We will provide class and demo session at student flexible timings.
  • In training case studies and real time scenarios covered.
  • We will give 24*7 technical supports.


And many sub topics are there for more details please go through the website.

Please call us for the Demo Classes we have regular batches and weekend batches.

Contact Number: India: +91 93918 55249
USA: +1 010 674 9448
Email: teksonit@gmail.com
Web : http://www.teksonit.com/sap-mm-online-training/

Sharepoint Online Training in India | UK | USA | Canada | Australia | Singapore

Sharepoint Online Training course by SUN IT Labs. We have excellent, experienced IT professionals with more than 10 years of real-time experience, and our trainers have good training experience, so best-quality output will be delivered. We also give interview support and technical support. We market your profile in the USA, UK, Singapore, India, and South Africa. We guarantee 100% satisfaction and provide best-quality real-time online training. We have completed more than 200 Sharepoint batches through our Online Sharepoint Training program; our Sharepoint classes cover all the real-time scenarios and are completely hands-on in each and every session.

Some of Sharepoint topics Covered by our professionals:

Introduction to SharePoint

• This course covers 4 modules i.e.
• Using SharePoint as an end user
• Designing sites as Designer
• Developing applications as SharePoint developer
• Administering sites as an administrator
• Introduction to SharePoint
• Introduction to Windows SharePoint Server
• Introduction to Microsoft Office SharePoint Server
• Other Key SharePoint Development Areas
• Windows SharePoint Services
• WSS Basics
• Content Types and Sites Columns
• Templates and Features

And many more subtopics are there; for more details, go through our website.

Please call us for the Demo Classes we have regular batches and weekend batches.

Contact Number: USA+1 512 234 3553,

Email: Contact@Sunitlabs.com,

Web: http://sunitlabs.com/sharepoint-online-training/

Online MSBI Training in USA | MSBI Training Online UK

Sun IT Labs provides the best Online MSBI Training, delivered by a team of talented trainers with real-time MSBI experience who have you work on real-time projects to get ready for a new job. We have trained thousands of students in MSBI as well as in various other technologies. In MSBI training we have ably adapted to the fast-changing real-time environment and student expectations. We also help our trainees find placements in various MNC companies.

Some of the MSBI Course topics that covered by our professionals:

1.SQL Server 2008 Analysis Services.
2.What Is Microsoft BI?
3.Core concept – BI is the cube or UDM.
4.Example cube as seen using Excel pivot table.
5.MS BI is comprehensive – more than Analysis Services on SQL Server.
6.Demonstration of SQL Reporting Services with cube as data source.
7.Introduction to OLAP Modeling.
8.Modeling source schemas—stars and snowflakes.
9.Understanding dimensional modeling— Dimensions (Type 1, 2, or 3) or rapidly changing.
10.Understanding fact (measures) and cube modeling.

And many sub topics are there for more details please go through the website.

Please call us for the Demo Classes we have regular batches and weekend batches.

Contact Us: USA: +1 512 234 3553,

Email: sunitlabs@gmail.com,

Web: http://sunitlabs.com/msbi-ssis-online-training/

Pentaho London Usergroup - BigData! 22nd July

Just over a week to go until the next Pentaho London usergroup. We have an amazing lineup: 5 rockstars covering the latest and greatest in the BigData and Pentaho space. Register here: http://www.meetup.com/Pentaho-London...nts/178590472/ We are starting earlier than usual, based on member feedback, which leaves more time for networking and food afterwards!

WTS : Canon EOS 5D Mark III 22.3 MP Digital SLR Camera.....$2,500

We are an authorized wholesaler/reseller, so our prices are not the same as retail prices. Most retailers get their orders from us and resell, but we get all our products directly from their respective manufacturers in bulk, and that is why we can offer them at these prices.

WE SELL ALL BRAND AND MODEL OF PHONES, CAMERA,LAPTOPS, MUSIC INSTRUMENTS. YOU

CAN CONTACT ME BELOW FOR MORE DETAILS.

Sales Rep : Gryson Raymond
Send Enquiry to : mobtechsales@gmail.com
Website : www.mobtechlimited.com


CHECK OUT ITEMS WE STOCK ON SALES : ---

Apple iphone ::

Apple iPhone 5S 64GB Black/White :... $450
Apple iPhone 5C 32GB Black/White :... $380
Apple iPhone 5 64GB.............................. $300


Apple iPads ::

Apple iPad Air Wi-Fi 128GB..................................$500
Apple iPad mini 2 Wi-Fi + Cellular 128GB .........$455

Sony ::

Sony Xperia Z1 Unlocked..................................$300
Sony Xperia Z2 Unlocked.................................$500
Sony Xperia Z2 Tablet......................................$450




Samsung ::

Samsung Galaxy Note 3.....................$400
Samsung Galaxy S4............................$399
Samsung Galaxy S5............................$450

HTC ::

HTC One M8 Unlocked.................$499
HTC One Unlocked.......................$399

LG Nexus 5....................................$350
LG G2............................................$350
Motorola Moto G Unlocked..........$150



Cameras :

Canon EOS 5D Mark III 22.3 MP Digital SLR Camera.....$2,500
Canon EF 70-200mm f/2.8L IS II USM Lens....................$1,100
Canon EOS 70D DSLR Camera (Body Only)......................$805

Nikon D800 Digital SLR Camera (Body Only)............................$1,700
Nikon AF-S Nikkor 24-70mm f/2.8G ED Autofocus Lens.........$999
Nikon D610 DSLR Camera (Body Only).....................................$1,399

Contact us : mobtechsales@gmail.com
Website : www.mobtechlimited.com

We deliver worldwide.
Drop shipping is available; we can drop-ship individual orders directly to your clients.



Sales Rep : Gryson Raymond
Send Enquiry to : mobtechsales@gmail.com
Website : www.mobtechlimited.com

data missing from export in PRD 5.1.0

I recently downloaded and am using the new stable version of PRD (5.1.0). With this new version, when I preview a report or export it to PDF, some pages are missing data. As a test I tried the same report in PRD 5.0 and the preview was not missing any data.

Is this a known issue? Or is there something I am doing wrong?

Thanks.

Datastage Online Training in INDIA | Online Datastage Training in Hyderabad

Teksonit Services provides a Datastage Online Training course taught by excellent, experienced IT professionals; our trainers have more than 10 years of real-time experience. We are in this field with passion and dedication and have been delivering online training for many years. Learners can grasp the technology from our highly experienced and certified trainers, which helps students work on real-time projects.

Some of the Datastage Course topics that covered by our professionals:


  1. Introduction about Data Stage
  2. Difference between Server Jobs and Parallel Jobs
  3. Difference between Pipeline Parallelisms
  4. Partition techniques (Round Robin, Random, Hash, Entire, Same, Modules,Range,DB2,Auto)
  5. Configuration File
  6. Difference between SMP/MPD architecture
  7. Data Stage Components (Server components/Client Components)
  8. Package Installer
  9. Data Stage Administrator
  10. Creating project, Editing project and Deleting project
  11. Permissions to user


And many sub topics are there for more details please go through the website.

Please call us for the Demo Classes we have regular batches and weekend batches.

Contact Number:
USA: +1 010-674-9448,
INDIA: +91 939-185-5249,

Email: teksonit@gmail.com,

Web: http://www.teksonit.com/datastage-online-training/

SAP BO Online Training in INDIA | Online SAP BO Training in USA | UK

Teksonit provides the best software training for various computer IT courses through WebEx and GoToMeeting. We provide SAP BO training based on the specific needs of the learners; in particular, we give innovative one-to-one classes, which have great opportunities in the present IT market. We also provide classroom courses to contend with today's competitive IT world. We have placed our students in India, USA, UK, Singapore, New Zealand, Canada, Australia, Japan, and Sweden. We also provide job support and interview support.
SAP BO Online Training Course Content:
1. SAP BO Integration Kit
2. SAP BO integrated solution approach
3. Server administration
4. SAP Authorization administration scenario
5. Overview
6. Integrated solution architecture
7. OLAP Universe design approach
8. Connectivity Options
9. Variables & Prompting
10. Hierarchies
Contact us : India : +91 9391855249
USA : +1 010-674-9448
Email : teksonit@gmail.com URL: http://www.teksonit.com/sap-bo-online-training/

Why doesn't a Pentaho switch case break after running the case?

I built a Switch/Case in `Pentaho` to run a predetermined case for a varying parameter value.


The problem is that `Pentaho` doesn't break after executing the `case`, so it runs all cases, not only the required one.


I attached an example of a switch case that writes output text files.


When I run the `Transform`, the switch case runs all cases without any break, and it outputs all three text files!

switch.jpg

What do I do to solve this problem? I know that a `Pentaho Transform` runs in `Parallel`, but why doesn't it break out of the switch case?
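For what it's worth, Switch/Case in a transformation is a row router, not a procedural switch with break: every target step starts when the transformation starts, and each incoming row is sent to exactly one target. In code terms the per-row dispatch behaves roughly like this (a sketch of the semantics with made-up target names, not Kettle internals):

```javascript
// Sketch of Switch/Case semantics: every target exists up front (which is
// why each Text file output step can still create its file), but each row
// is routed to exactly one target based on the case field's value.
function route(rows, caseField, targets, defaultTarget) {
  const out = {};
  for (const name of Object.values(targets)) out[name] = [];
  if (defaultTarget) out[defaultTarget] = out[defaultTarget] || [];
  for (const row of rows) {
    const target = Object.prototype.hasOwnProperty.call(targets, row[caseField])
      ? targets[row[caseField]]
      : defaultTarget;
    if (target) out[target].push(row);
  }
  return out;
}
```

So the three files appear not because all cases "ran" their data, but because each "Text file output" step creates its file even when it receives zero rows; the step has an option along the lines of "Do not create file at start" that keeps row-less branches from creating files, and the step metrics show which branch actually received the rows.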


This is the `.ktr` file contents:


Code:

<?xml version="1.0" encoding="UTF-8"?>
    <transformation>
      <info>
        <name>trans</name>
        <description/>
        <extended_description/>
        <trans_version/>
        <trans_type>Normal</trans_type>
        <trans_status>0</trans_status>
        <directory>&#x2f;</directory>
        <parameters>
            <parameter>
                <name>var</name>
                <default_value/>
                <description/>
            </parameter>
        </parameters>
        <log>
    <trans-log-table><connection/>
    <schema/>
    <table/>
    <size_limit_lines/>
    <interval/>
    <timeout_days/>
    <field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>TRANSNAME</id><enabled>Y</enabled><name>TRANSNAME</name></field><field><id>STATUS</id><enabled>Y</enabled><name>STATUS</name></field><field><id>LINES_READ</id><enabled>Y</enabled><name>LINES_READ</name><subject/></field><field><id>LINES_WRITTEN</id><enabled>Y</enabled><name>LINES_WRITTEN</name><subject/></field><field><id>LINES_UPDATED</id><enabled>Y</enabled><name>LINES_UPDATED</name><subject/></field><field><id>LINES_INPUT</id><enabled>Y</enabled><name>LINES_INPUT</name><subject/></field><field><id>LINES_OUTPUT</id><enabled>Y</enabled><name>LINES_OUTPUT</name><subject/></field><field><id>LINES_REJECTED</id><enabled>Y</enabled><name>LINES_REJECTED</name><subject/></field><field><id>ERRORS</id><enabled>Y</enabled><name>ERRORS</name></field><field><id>STARTDATE</id><enabled>Y</enabled><name>STARTDATE</name></field><field><id>ENDDATE</id><enabled>Y</enabled><name>ENDDATE</name></field><field><id>LOGDATE</id><enabled>Y</enabled><name>LOGDATE</name></field><field><id>DEPDATE</id><enabled>Y</enabled><name>DEPDATE</name></field><field><id>REPLAYDATE</id><enabled>Y</enabled><name>REPLAYDATE</name></field><field><id>LOG_FIELD</id><enabled>Y</enabled><name>LOG_FIELD</name></field><field><id>EXECUTING_SERVER</id><enabled>N</enabled><name>EXECUTING_SERVER</name></field><field><id>EXECUTING_USER</id><enabled>N</enabled><name>EXECUTING_USER</name></field><field><id>CLIENT</id><enabled>N</enabled><name>CLIENT</name></field></trans-log-table>
    <perf-log-table><connection/>
    <schema/>
    <table/>
    <interval/>
    <timeout_days/>
    <field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>SEQ_NR</id><enabled>Y</enabled><name>SEQ_NR</name></field><field><id>LOGDATE</id><enabled>Y</enabled><name>LOGDATE</name></field><field><id>TRANSNAME</id><enabled>Y</enabled><name>TRANSNAME</name></field><field><id>STEPNAME</id><enabled>Y</enabled><name>STEPNAME</name></field><field><id>STEP_COPY</id><enabled>Y</enabled><name>STEP_COPY</name></field><field><id>LINES_READ</id><enabled>Y</enabled><name>LINES_READ</name></field><field><id>LINES_WRITTEN</id><enabled>Y</enabled><name>LINES_WRITTEN</name></field><field><id>LINES_UPDATED</id><enabled>Y</enabled><name>LINES_UPDATED</name></field><field><id>LINES_INPUT</id><enabled>Y</enabled><name>LINES_INPUT</name></field><field><id>LINES_OUTPUT</id><enabled>Y</enabled><name>LINES_OUTPUT</name></field><field><id>LINES_REJECTED</id><enabled>Y</enabled><name>LINES_REJECTED</name></field><field><id>ERRORS</id><enabled>Y</enabled><name>ERRORS</name></field><field><id>INPUT_BUFFER_ROWS</id><enabled>Y</enabled><name>INPUT_BUFFER_ROWS</name></field><field><id>OUTPUT_BUFFER_ROWS</id><enabled>Y</enabled><name>OUTPUT_BUFFER_ROWS</name></field></perf-log-table>
    <channel-log-table><connection/>
    <schema/>
    <table/>
    <timeout_days/>
    <field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>LOG_DATE</id><enabled>Y</enabled><name>LOG_DATE</name></field><field><id>LOGGING_OBJECT_TYPE</id><enabled>Y</enabled><name>LOGGING_OBJECT_TYPE</name></field><field><id>OBJECT_NAME</id><enabled>Y</enabled><name>OBJECT_NAME</name></field><field><id>OBJECT_COPY</id><enabled>Y</enabled><name>OBJECT_COPY</name></field><field><id>REPOSITORY_DIRECTORY</id><enabled>Y</enabled><name>REPOSITORY_DIRECTORY</name></field><field><id>FILENAME</id><enabled>Y</enabled><name>FILENAME</name></field><field><id>OBJECT_ID</id><enabled>Y</enabled><name>OBJECT_ID</name></field><field><id>OBJECT_REVISION</id><enabled>Y</enabled><name>OBJECT_REVISION</name></field><field><id>PARENT_CHANNEL_ID</id><enabled>Y</enabled><name>PARENT_CHANNEL_ID</name></field><field><id>ROOT_CHANNEL_ID</id><enabled>Y</enabled><name>ROOT_CHANNEL_ID</name></field></channel-log-table>
    <step-log-table><connection/>
    <schema/>
    <table/>
    <timeout_days/>
    <field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>LOG_DATE</id><enabled>Y</enabled><name>LOG_DATE</name></field><field><id>TRANSNAME</id><enabled>Y</enabled><name>TRANSNAME</name></field><field><id>STEPNAME</id><enabled>Y</enabled><name>STEPNAME</name></field><field><id>STEP_COPY</id><enabled>Y</enabled><name>STEP_COPY</name></field><field><id>LINES_READ</id><enabled>Y</enabled><name>LINES_READ</name></field><field><id>LINES_WRITTEN</id><enabled>Y</enabled><name>LINES_WRITTEN</name></field><field><id>LINES_UPDATED</id><enabled>Y</enabled><name>LINES_UPDATED</name></field><field><id>LINES_INPUT</id><enabled>Y</enabled><name>LINES_INPUT</name></field><field><id>LINES_OUTPUT</id><enabled>Y</enabled><name>LINES_OUTPUT</name></field><field><id>LINES_REJECTED</id><enabled>Y</enabled><name>LINES_REJECTED</name></field><field><id>ERRORS</id><enabled>Y</enabled><name>ERRORS</name></field><field><id>LOG_FIELD</id><enabled>N</enabled><name>LOG_FIELD</name></field></step-log-table>
    <metrics-log-table><connection/>
    <schema/>
    <table/>
    <timeout_days/>
    <field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>LOG_DATE</id><enabled>Y</enabled><name>LOG_DATE</name></field><field><id>METRICS_DATE</id><enabled>Y</enabled><name>METRICS_DATE</name></field><field><id>METRICS_CODE</id><enabled>Y</enabled><name>METRICS_CODE</name></field><field><id>METRICS_DESCRIPTION</id><enabled>Y</enabled><name>METRICS_DESCRIPTION</name></field><field><id>METRICS_SUBJECT</id><enabled>Y</enabled><name>METRICS_SUBJECT</name></field><field><id>METRICS_TYPE</id><enabled>Y</enabled><name>METRICS_TYPE</name></field><field><id>METRICS_VALUE</id><enabled>Y</enabled><name>METRICS_VALUE</name></field></metrics-log-table>
        </log>
        <maxdate>
          <connection/>
          <table/>
          <field/>
          <offset>0.0</offset>
          <maxdiff>0.0</maxdiff>
        </maxdate>
        <size_rowset>10000</size_rowset>
        <sleep_time_empty>50</sleep_time_empty>
        <sleep_time_full>50</sleep_time_full>
        <unique_connections>N</unique_connections>
        <feedback_shown>Y</feedback_shown>
        <feedback_size>50000</feedback_size>
        <using_thread_priorities>Y</using_thread_priorities>
        <shared_objects_file/>
        <capture_step_performance>N</capture_step_performance>
        <step_performance_capturing_delay>1000</step_performance_capturing_delay>
        <step_performance_capturing_size_limit>100</step_performance_capturing_size_limit>
        <dependencies>
        </dependencies>
        <partitionschemas>
        </partitionschemas>
        <slaveservers>
        </slaveservers>
        <clusterschemas>
        </clusterschemas>
      <created_user>-</created_user>
      <created_date>2014&#x2f;07&#x2f;13 12&#x3a;36&#x3a;07.782</created_date>
      <modified_user>-</modified_user>
      <modified_date>2014&#x2f;07&#x2f;13 12&#x3a;36&#x3a;07.782</modified_date>
      </info>
      <notepads>
      </notepads>
      <connection>
        <name>AgileBI</name>
        <server>localhost</server>
        <type>MONETDB</type>
        <access>Native</access>
        <database>pentaho-instaview</database>
        <port>50006</port>
        <username>monetdb</username>
        <password>Encrypted 2be98afc86aa7f2e4cb14a17edb86abd8</password>
        <servername/>
        <data_tablespace/>
        <index_tablespace/>
        <read_only>true</read_only>
        <attributes>
          <attribute><code>EXTRA_OPTION_INFOBRIGHT.characterEncoding</code><attribute>UTF-8</attribute></attribute>
          <attribute><code>EXTRA_OPTION_MYSQL.defaultFetchSize</code><attribute>500</attribute></attribute>
          <attribute><code>EXTRA_OPTION_MYSQL.useCursorFetch</code><attribute>true</attribute></attribute>
          <attribute><code>PORT_NUMBER</code><attribute>50006</attribute></attribute>
          <attribute><code>PRESERVE_RESERVED_WORD_CASE</code><attribute>Y</attribute></attribute>
          <attribute><code>SUPPORTS_BOOLEAN_DATA_TYPE</code><attribute>Y</attribute></attribute>
          <attribute><code>SUPPORTS_TIMESTAMP_DATA_TYPE</code><attribute>Y</attribute></attribute>
        </attributes>
      </connection>
      <order>
      <hop> <from>Switch &#x2f; Case</from><to>Text file output</to><enabled>Y</enabled> </hop>
      <hop> <from>Switch &#x2f; Case</from><to>Text file output 2</to><enabled>Y</enabled> </hop>
      <hop> <from>Switch &#x2f; Case</from><to>Text file output 2 2</to><enabled>Y</enabled> </hop>
      <hop> <from>Text file output</from><to>Dummy &#x28;do nothing&#x29;</to><enabled>Y</enabled> </hop>
      <hop> <from>Generate Rows</from><to>Set field value to a constant</to><enabled>Y</enabled> </hop>
      <hop> <from>Set field value to a constant</from><to>Switch &#x2f; Case</to><enabled>Y</enabled> </hop>
      </order>
      <step>
        <name>Switch &#x2f; Case</name>
        <type>SwitchCase</type>
        <description/>
        <distribute>Y</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
    <fieldname>var</fieldname>
    <use_contains>N</use_contains>
    <case_value_type>String</case_value_type>
    <case_value_format/>
    <case_value_decimal/>
    <case_value_group/>
    <default_target_step/>
    <cases><case><value>a</value>
    <target_step>Text file output</target_step>
    </case><case><value>b</value>
    <target_step>Text file output 2</target_step>
    </case><case><value>c</value>
    <target_step>Text file output 2 2</target_step>
    </case></cases>    <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>271</xloc>
          <yloc>99</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step>
        <name>Generate Rows</name>
        <type>RowGenerator</type>
        <description/>
        <distribute>Y</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
        <fields>
          <field>
            <name>var</name>
            <type>String</type>
            <format/>
            <currency/>
            <decimal/>
            <group/>
            <nullif/>
            <length>-1</length>
            <precision>-1</precision>
            <set_empty_string>N</set_empty_string>
          </field>
        </fields>
        <limit>10</limit>
        <never_ending>N</never_ending>
        <interval_in_ms>5000</interval_in_ms>
        <row_time_field>now</row_time_field>
        <last_time_field>FiveSecondsAgo</last_time_field>
        <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>47</xloc>
          <yloc>97</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step>
        <name>Text file output</name>
        <type>TextFileOutput</type>
        <description/>
        <distribute>N</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
        <separator>&#x3b;</separator>
        <enclosure>&#x22;</enclosure>
        <enclosure_forced>N</enclosure_forced>
        <enclosure_fix_disabled>N</enclosure_fix_disabled>
        <header>Y</header>
        <footer>N</footer>
        <format>DOS</format>
        <compression>None</compression>
        <encoding/>
        <endedLine/>
        <fileNameInField>N</fileNameInField>
        <fileNameField/>
        <create_parent_folder>Y</create_parent_folder>
        <file>
          <name>a</name>
          <is_command>N</is_command>
          <servlet_output>N</servlet_output>
          <do_not_open_new_file_init>N</do_not_open_new_file_init>
          <extention>txt</extention>
          <append>N</append>
          <split>N</split>
          <haspartno>N</haspartno>
          <add_date>N</add_date>
          <add_time>N</add_time>
          <SpecifyFormat>N</SpecifyFormat>
          <date_time_format/>
          <add_to_result_filenames>Y</add_to_result_filenames>
          <pad>N</pad>
          <fast_dump>N</fast_dump>
          <splitevery>0</splitevery>
        </file>
        <fields>
        </fields>
        <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>463</xloc>
          <yloc>98</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step>
        <name>Text file output 2</name>
        <type>TextFileOutput</type>
        <description/>
        <distribute>Y</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
        <separator>&#x3b;</separator>
        <enclosure>&#x22;</enclosure>
        <enclosure_forced>N</enclosure_forced>
        <enclosure_fix_disabled>N</enclosure_fix_disabled>
        <header>Y</header>
        <footer>N</footer>
        <format>DOS</format>
        <compression>None</compression>
        <encoding/>
        <endedLine/>
        <fileNameInField>N</fileNameInField>
        <fileNameField/>
        <create_parent_folder>Y</create_parent_folder>
        <file>
          <name>b</name>
          <is_command>N</is_command>
          <servlet_output>N</servlet_output>
          <do_not_open_new_file_init>N</do_not_open_new_file_init>
          <extention>txt</extention>
          <append>N</append>
          <split>N</split>
          <haspartno>N</haspartno>
          <add_date>N</add_date>
          <add_time>N</add_time>
          <SpecifyFormat>N</SpecifyFormat>
          <date_time_format/>
          <add_to_result_filenames>Y</add_to_result_filenames>
          <pad>N</pad>
          <fast_dump>N</fast_dump>
          <splitevery>0</splitevery>
        </file>
        <fields>
        </fields>
        <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>463</xloc>
          <yloc>182</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step>
        <name>Text file output 2 2</name>
        <type>TextFileOutput</type>
        <description/>
        <distribute>Y</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
        <separator>&#x3b;</separator>
        <enclosure>&#x22;</enclosure>
        <enclosure_forced>N</enclosure_forced>
        <enclosure_fix_disabled>N</enclosure_fix_disabled>
        <header>Y</header>
        <footer>N</footer>
        <format>DOS</format>
        <compression>None</compression>
        <encoding/>
        <endedLine/>
        <fileNameInField>N</fileNameInField>
        <fileNameField/>
        <create_parent_folder>Y</create_parent_folder>
        <file>
          <name>c</name>
          <is_command>N</is_command>
          <servlet_output>N</servlet_output>
          <do_not_open_new_file_init>N</do_not_open_new_file_init>
          <extention>txt</extention>
          <append>N</append>
          <split>N</split>
          <haspartno>N</haspartno>
          <add_date>N</add_date>
          <add_time>N</add_time>
          <SpecifyFormat>N</SpecifyFormat>
          <date_time_format/>
          <add_to_result_filenames>Y</add_to_result_filenames>
          <pad>N</pad>
          <fast_dump>N</fast_dump>
          <splitevery>0</splitevery>
        </file>
        <fields>
        </fields>
        <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>463</xloc>
          <yloc>272</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step>
        <name>Dummy &#x28;do nothing&#x29;</name>
        <type>Dummy</type>
        <description/>
        <distribute>Y</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
        <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>694</xloc>
          <yloc>102</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step>
        <name>Set field value to a constant</name>
        <type>SetValueConstant</type>
        <description/>
        <distribute>Y</distribute>
        <custom_distribution/>
        <copies>1</copies>
            <partitioning>
              <method>none</method>
              <schema_name/>
              </partitioning>
      <usevar>Y</usevar>
        <fields>
          <field>
            <name>var</name>
            <value>&#x24;&#x7b;var&#x7d;</value>
            <mask/>
            <set_empty_string>N</set_empty_string>
            </field>
          </fields>
        <cluster_schema/>
    <remotesteps>  <input>  </input>  <output>  </output> </remotesteps>    <GUI>
          <xloc>147</xloc>
          <yloc>97</yloc>
          <draw>Y</draw>
          </GUI>
        </step>
   
      <step_error_handling>
      </step_error_handling>
      <slave-step-copy-partition-distribution>
    </slave-step-copy-partition-distribution>
      <slave_transformation>N</slave_transformation>
    <attributes><group><name>DataService</name>
    <attribute><key>stepname</key>
    <value/>
    </attribute><attribute><key>name</key>
    <value/>
    </attribute></group></attributes>
   
    </transformation>
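In plain terms, the transformation above generates rows, sets the field `var` from the variable `${var}`, and then the Switch / Case step routes each row to one of three Text file output steps (files `a`, `b`, `c`). A minimal sketch of that routing logic (hypothetical Python, not how Kettle actually executes it):

```python
# Sketch of the Switch / Case routing configured in the XML above.
# The case values ("a", "b", "c") and target files come from the
# <cases> block; note that no default target step is configured.

def route_row(var_value):
    """Return the output file for a row, mimicking the Switch / Case step."""
    targets = {
        "a": "a.txt",  # -> step "Text file output"
        "b": "b.txt",  # -> step "Text file output 2"
        "c": "c.txt",  # -> step "Text file output 2 2"
    }
    return targets.get(var_value)  # None: row matches no case, no default set
```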

Thanks,





Convert (WILD) String to Date/Int

Hey, so my database stores the date in this format "17 Apr 2012 | 03:37 pm" as a varchar.

How can I convert it into DATE or INT? I've tried the Select Values step, but I don't have any idea how to define the date format. Kettle wouldn't accept "dd MM yyyy | HH:mm pm" or "dd MM yyyy | HH:mm".

Help, please!
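For what it's worth, Kettle date masks follow java.text.SimpleDateFormat, so a mask matching "17 Apr 2012 | 03:37 pm" would be "dd MMM yyyy | hh:mm a" (MMM for the abbreviated month name, hh plus a for 12-hour time with an am/pm marker). The equivalent parse in Python, just to sanity-check the pattern:

```python
from datetime import datetime

raw = "17 Apr 2012 | 03:37 pm"
# Python equivalent of the SimpleDateFormat mask "dd MMM yyyy | hh:mm a":
# %d = day, %b = abbreviated month, %Y = 4-digit year,
# %I:%M %p = 12-hour time with am/pm marker
parsed = datetime.strptime(raw, "%d %b %Y | %I:%M %p")
print(parsed)  # 2012-04-17 15:37:00
```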

[CDA] Using CDATA - (Unparsed) Character Data

Hello everyone!

I'm having trouble using the XML CDATA syntax inside the CDA DataAccess definition (http://www.w3schools.com/xml/xml_cdata.asp)

Can anyone tell me if it's supported by CDA and/or provide a working example, please?

This works, using &quot; so that CDA recognizes the " character inside the string:

Code:

<Parameters>
            <Parameter name="centerArray" type="String" default="[Center].[Center].CurrentMember.Name =  &quot;Center A&quot; OR [Center].[Center].CurrentMember.Name = &quot;Center B&quot;" />
        </Parameters>

And this is what I have tried without success (CDA stops working):
Code:

        <Parameters>
        <![CDATA[
            <Parameter name="centerArray" type="String" default="[Center].[Center].CurrentMember.Name =  "Center A" OR [Center].[Center].CurrentMember.Name = "Center B"" />
            ]]>
        </Parameters>

Thanks in advance!
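For what it's worth, this looks like expected XML behaviour rather than a CDA limitation: a CDATA section marks its contents as plain character data, so the parser no longer sees a `<Parameter>` element at all, and the XML spec does not allow CDATA sections inside attribute values — which makes the &quot; entity approach the correct one. A small sketch using Python's stdlib parser (element and attribute names here are just illustrative) showing the difference:

```python
import xml.etree.ElementTree as ET

# Attribute value escaped with &quot; entities: the parser sees a real element.
with_entities = '<Parameters><Parameter name="p" default="&quot;Center A&quot;"/></Parameters>'
root = ET.fromstring(with_entities)
print(root.find("Parameter").get("default"))  # "Center A"

# The same element wrapped in CDATA: it becomes character data, not markup.
with_cdata = '<Parameters><![CDATA[<Parameter name="p" default="x"/>]]></Parameters>'
root2 = ET.fromstring(with_cdata)
print(root2.find("Parameter"))  # None - the parser sees only text
print(root2.text)               # <Parameter name="p" default="x"/>
```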