File is too big


#1

Hey guys, we love this over at our startup, but it says our file is too large. Who can we talk to about upping that limit? Thanks!!


#2

Hi,

We need a little more context for your request.

I am going to guess you are referring to dragging and dropping a file into Immerse?

Regards


#3

Yes sir! Dragging and dropping. I also tried to write from Spark directly to MapD, but it kept saying "Connected Failed".

Is there a higher threshold if I write directly? How high is that threshold? Thanks!!


#4

Hi,

For large initial loads we would recommend landing the file on the server and using the COPY FROM command. This is what Immerse does behind the scenes, but Immerse has a limit on how large the files it transfers can be.
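
If it helps, here is a minimal sketch of that command (the table name and file path are placeholders for your own; the target table must already exist with a schema that matches the CSV):

```sql
-- Sketch only: 'my_table' and the path are placeholders.
-- The table must already be created with a matching schema.
COPY my_table FROM '/path/on/server/my_file.csv' WITH (header = 'true');
```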

This method will allow you to load files into tables with only your disk as the limiting factor. Obviously, at query execution time you will need RAM, GPU, and CPU resources of an appropriate scale to deal with the size of the tables you have loaded :slight_smile:

Regards


#5

Ahhh thank you.

In your expert opinion, how would you recommend landing the file on the server if I have the CSV locally and I’m using an AWS backend (https://aws.amazon.com/marketplace/pp/B071H71L2Y?qid=1504819898310&sr=0-2&ref_=srh_res_product_title)?


#6

Hi,

It depends somewhat on the size of the file, but the easiest approach would be to use scp from your local machine to the AWS instance. Are you familiar with scp?
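
For reference, a quick sketch (the key file, user name, host, and paths below are all placeholders for your own values):

```bash
# Copy the local CSV up to the AWS instance over SSH.
# Key file, user, host, and paths are placeholders.
scp -i ~/.ssh/my-key.pem ./my_file.csv ec2-user@<your-instance-address>:/tmp/my_file.csv
```

Once the file has landed on the instance, you can point COPY FROM at it.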

Regards


#7

I am now! Thank you!!