Spark streaming: How to read and write temporary files?


I need to write a Spark app that uses temporary files.

I need to download many large files, read them with legacy code, process them, delete each file, and write the results to a database.

The files are on S3 and take a long time to download. However, many can be downloaded at once, so I want to download a large number of them in parallel. The legacy code reads from the file system.
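One way to get that parallelism inside each Spark task is a thread pool, since downloads are I/O-bound. A minimal sketch, where `fetch` and the key names are placeholders for whatever download function you actually use (e.g. a boto3 call):

```python
from concurrent.futures import ThreadPoolExecutor

def download_all(keys, fetch, max_workers=8):
    """Download many files concurrently; returns results in input order.

    keys: iterable of S3 keys (or URLs); fetch: your download function.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the order of `keys` in the returned results
        return list(pool.map(fetch, keys))

# Within Spark, each task could call download_all on its own slice of keys
# (e.g. inside mapPartitions), so downloads overlap both across tasks and
# within a task.
```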

I don't think I can avoid creating temporary files. What are the rules for Spark code reading and writing local files?
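A common pattern is to keep the temp file's whole lifecycle inside a single function that runs on the worker: create it on local disk, hand the path to the legacy reader, and delete it in a `finally` block. The sketch below assumes hypothetical `download_from_s3` and `legacy_process` functions standing in for your own code:

```python
import os
import tempfile

def process_one(url, download_from_s3, legacy_process):
    """Download one remote file to a local temp path, run the legacy
    file-system reader on it, and always delete the temp file."""
    # mkstemp creates the file on the worker's local disk (spark.local.dir
    # or the system temp dir, depending on how you pick the directory)
    fd, path = tempfile.mkstemp(suffix=".dat")
    os.close(fd)
    try:
        download_from_s3(url, path)   # fetch the remote file to the local path
        return legacy_process(path)   # legacy code reads from the file system
    finally:
        os.remove(path)               # clean up even if processing fails

# Inside Spark this could run per element or per partition, e.g.:
#   results = urls_rdd.map(lambda u: process_one(u, download_from_s3,
#                                                legacy_process))
```

Keeping creation and deletion in one function means temp files never outlive the task, even when the download or the legacy code throws.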

This must be a common issue, but I haven't found threads or docs that talk about it. Can anyone give me a pointer?

many p

