I'm a fairly low-level data user and haven't touched MySQL in a while. I'm also new to PDI (Pentaho Data Integration).
My father bought some data for political use; it has over a million rows and 267 columns.
I created a table so he can actually use his data, but I'm having trouble getting the data into the table.
I want to know if there is a way to use Pentaho to read the CSV and load it into a fresh table that Pentaho itself generates. I realize that's not ideal, but I'm only about 90% sure the table I created is correct.
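To double-check my hand-built table, I was thinking of something like this rough sketch (the type rules and the sample data are just my guesses, not anything authoritative) that infers column types from a sample of the CSV and prints a CREATE TABLE statement I can compare against mine:

```python
import csv
import io

def infer_sql_type(values):
    """Guess a MySQL column type from a sample of string values.
    Very crude: integers become BIGINT, everything else VARCHAR."""
    def is_int(v):
        try:
            int(v)
            return True
        except ValueError:
            return False
    non_empty = [v for v in values if v != ""]
    if non_empty and all(is_int(v) for v in non_empty):
        return "BIGINT"
    width = max((len(v) for v in values), default=1)
    return f"VARCHAR({max(width, 1)})"

def create_table_ddl(table, header, rows):
    """Build a CREATE TABLE statement from a header row and sample rows."""
    cols = []
    for i, name in enumerate(header):
        sample = [r[i] for r in rows]
        cols.append(f"  `{name}` {infer_sql_type(sample)}")
    return f"CREATE TABLE `{table}` (\n" + ",\n".join(cols) + "\n);"

# Tiny inline example standing in for the real 267-column file:
sample_csv = '"id","name","age"\n"1","Ann","34"\n"2","Bob","not given"\n'
reader = csv.reader(io.StringIO(sample_csv))
header = next(reader)
rows = list(reader)
print(create_table_ddl("voters", header, rows))
```

On the real file I'd read only the first few thousand rows as the sample, since a million-row full scan is unnecessary just to eyeball types.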
A couple of things I have tried so far:
MySQL Workbench's import wizard was taking too long, and then a Windows 10 auto-update (argh) killed that experiment.
A LOAD DATA INFILE statement has gone wonky on me: it hangs up on certain fields, claiming it expects an integer but is pulling in a quotation mark. I have properly specified the field enclosure, so I'm not sure why it would do that.
Trying Pentaho with a CSV file input step feeding a Table output step throws an error almost immediately, and I have no clue how to read the error.
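For what it's worth, here is a sketch of the kind of check I've been doing by hand on the "expecting an integer, got a quotation mark" problem (the sample rows and column index are made up for illustration). If the enclosure character in the file doesn't match what the loader was told, the quotes never get stripped and the integer parse fails:

```python
import csv

def find_bad_int_rows(lines, col, delimiter=",", quotechar='"'):
    """Return (row_number, raw_value) pairs where column `col`
    fails to parse as an integer after quote stripping."""
    bad = []
    reader = csv.reader(lines, delimiter=delimiter, quotechar=quotechar)
    for n, row in enumerate(reader, start=1):
        if col >= len(row):
            bad.append((n, "<missing>"))
            continue
        try:
            int(row[col])
        except ValueError:
            bad.append((n, row[col]))
    return bad

# Row 2 uses single quotes as the enclosure; parsed with a double-quote
# quotechar, the literal quotes survive and the int() conversion fails,
# which is exactly the symptom LOAD DATA INFILE reports.
sample = ['"1","Ann"', "'2','Bob'", '"3","Cal"']
print(find_bad_int_rows(sample, col=0))  # → [(2, "'2'")]
```

Running this over the real file (opening it with `newline=""` per the csv docs) at least tells me which rows and values the loader is choking on.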