MapD CSV import from S3 isn't recognising columns with large string values


#1

I was trying the S3 import/copy option. Does it support the Parquet/Avro formats?

S3 import fails when the CSV file contains columns with a lot of characters.
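
For reference, a minimal sketch of the kind of load that fails for me (the table name, bucket path, and credentials below are placeholders):

```python
# Sketch of the failing S3 load, issued through the pymapd client.
# Table name, bucket path, and credentials are placeholders;
# s3_access_key / s3_secret_key / s3_region are the COPY options
# for passing S3 credentials.
from pymapd import connect

con = connect(user="mapd", password="HyperInteractive",
              host="localhost", dbname="mapd")

con.execute("""
    COPY my_table FROM 's3://my-bucket/data/long_strings.csv'
    WITH (header='true',
          s3_access_key='...',
          s3_secret_key='...',
          s3_region='us-east-1');
""")
```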

Similarly, for the KafkaImporter/StreamImporter utilities, do you support Parquet/Avro record formats?

Thanks for the help.


#2

Hi,

No, we do not currently support the Parquet or Avro file formats natively in our loader; you need to convert to CSV first.
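
For example, one quick way to do the conversion is a short Python script (assuming a pandas version with Parquet support via pyarrow; file paths are placeholders):

```python
# One-off Parquet -> CSV conversion so the file can be loaded with COPY FROM.
# Paths are placeholders; requires pandas with the pyarrow engine installed.
import pandas as pd

df = pd.read_parquet("input_data.parquet")
# Keep the header row so the subsequent load can use WITH (header='true').
df.to_csv("output_data.csv", index=False)
```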

This is something we have on our roadmap.

Please send us more details and examples of the issues you are seeing with S3 loads of CSV files that contain long string values.

regards