
Loading into MongoDB

Please help. I'm new to Pentaho and MongoDB. I have been using other commercial ETL tools, and most of the time dealing with RDBMSs.

We have a requirement to load data from SQL Server into MongoDB using Pentaho.

We have three tables set up in the source, like the following:


main product table:
prod_key, prod_name, ...

related product table:
prod_key, related_prod_name, related_partnumber, category,...

bundled product table:
prod_key, bundled_prod_name, bundled_partnumber, ...


In MongoDB, the target document structure would be:

{
  _id,
  prod_key,
  prod_name,

  related_product: {
    category1: [                 -- rows where category = 1
      { related_prod_name, related_partnumber, ... },
      { related_prod_name, related_partnumber, ... },
      { related_prod_name, related_partnumber, ... },
      ...
    ],
    category2: [                 -- rows where category = 2
      { related_prod_name, related_partnumber, ... },
      { related_prod_name, related_partnumber, ... },
      { related_prod_name, related_partnumber, ... },
      ...
    ]
  },

  bundled_product: [
    { bundled_prod_name, bundled_partnumber, ... },
    { bundled_prod_name, bundled_partnumber, ... },
    { bundled_prod_name, bundled_partnumber, ... },
    ...
  ]
}
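So, with made-up values (and _id left out since MongoDB generates it), a single product document would look something like this:

```json
{
  "prod_key": 1001,
  "prod_name": "Widget Pro",
  "related_product": {
    "category1": [
      { "related_prod_name": "Widget Case", "related_partnumber": "WC-10" },
      { "related_prod_name": "Widget Strap", "related_partnumber": "WS-22" }
    ],
    "category2": [
      { "related_prod_name": "Widget Charger", "related_partnumber": "WG-31" }
    ]
  },
  "bundled_product": [
    { "bundled_prod_name": "Widget Starter Kit", "bundled_partnumber": "WK-01" }
  ]
}
```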



The idea is that the main product table's attributes are moved straight across into the top-level fields. The related product and bundled product tables only populate the corresponding substructures; for each product there may be many related or bundled products.


I would like to know: is it possible to form and hold the JSON structure before loading it into MongoDB, and then insert everything in a single shot?
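To make the question concrete, here is a rough sketch of what I mean, outside of Pentaho, in plain Python with pymongo (the connection string, database/collection names and sample rows are all made up):

```python
# Sketch of the idea: build every product document in memory first,
# then insert them all into MongoDB in one shot.
from collections import defaultdict
from pymongo import MongoClient

# Rows as they would come out of the three SQL Server tables (sample values).
main_rows = [
    {"prod_key": 1001, "prod_name": "Widget Pro"},
]
related_rows = [
    {"prod_key": 1001, "related_prod_name": "Widget Case", "related_partnumber": "WC-10", "category": 1},
    {"prod_key": 1001, "related_prod_name": "Widget Charger", "related_partnumber": "WG-31", "category": 2},
]
bundled_rows = [
    {"prod_key": 1001, "bundled_prod_name": "Widget Starter Kit", "bundled_partnumber": "WK-01"},
]

# Group related products by prod_key and by category ("category1", "category2", ...).
related_by_key = defaultdict(lambda: defaultdict(list))
for r in related_rows:
    related_by_key[r["prod_key"]]["category%d" % r["category"]].append(
        {"related_prod_name": r["related_prod_name"],
         "related_partnumber": r["related_partnumber"]})

# Group bundled products by prod_key.
bundled_by_key = defaultdict(list)
for b in bundled_rows:
    bundled_by_key[b["prod_key"]].append(
        {"bundled_prod_name": b["bundled_prod_name"],
         "bundled_partnumber": b["bundled_partnumber"]})

# Form and hold the complete documents first ...
docs = [
    {
        "prod_key": m["prod_key"],
        "prod_name": m["prod_name"],
        "related_product": dict(related_by_key.get(m["prod_key"], {})),
        "bundled_product": bundled_by_key.get(m["prod_key"], []),
    }
    for m in main_rows
]

# ... then load everything in a single shot.
client = MongoClient("mongodb://localhost:27017")   # hypothetical connection
client["productdb"]["products"].insert_many(docs)   # one bulk insert
```

In PDI I assume the equivalent would be to sort/group the child rows by prod_key and attach them as nested arrays before the MongoDB Output step, but I'm not sure which steps to use for that part.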
