Caching remote files with Python
Background
We have a lot of data files stored on a network drive that we process in Python. For performance reasons we typically copy the files to the local SSD before processing. I wish to make this happen automatically: whenever we try to open a file, grab the remote version if it isn't stored locally, and ideally keep some sort of timer that deletes the local copies after a while. The files practically never change, so we do not require actual syncing capabilities.
Functionality
To sum up, I am looking for functionality that provides:
- keeping a local cache of files/directories from a network drive, automatically retrieving the remote version when it is unavailable locally
- support for directory structure - that is, files stored in a directory structure on the remote server should be duplicated locally for the requested files
- ideally some sort of timer to expire cached files
It wouldn't be difficult for me to write a piece of code that does this myself, but when possible I prefer to rely on existing projects, since that typically gives a more versatile end result and also makes any improvements of my own available to other users.
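For reference, here is a minimal sketch of roughly the behavior I am after. All names are placeholders: REMOTE_ROOT, CACHE_ROOT and MAX_AGE_SECONDS would need to be adapted to the actual environment, and a real solution would also need things like locking for concurrent access.

    import shutil
    import time
    from pathlib import Path

    # Placeholder paths -- adjust to the actual environment.
    REMOTE_ROOT = Path(r"\\fileserver\data")  # network drive (UNC path)
    CACHE_ROOT = Path(r"C:\cache\data")       # local SSD cache
    MAX_AGE_SECONDS = 24 * 60 * 60            # expire cached copies after one day

    def cached_open(relative_path, mode="rb"):
        """Open a file from the local cache, copying it from the
        network drive first if no fresh local copy exists."""
        local = CACHE_ROOT / relative_path
        if not local.exists() or time.time() - local.stat().st_mtime > MAX_AGE_SECONDS:
            # Mirror the remote directory structure locally.
            local.parent.mkdir(parents=True, exist_ok=True)
            # shutil.copy does not preserve the remote mtime, so the
            # local mtime records when the file was cached.
            shutil.copy(REMOTE_ROOT / relative_path, local)
        return open(local, mode)

    def expire_cache():
        """Delete cached files older than MAX_AGE_SECONDS."""
        now = time.time()
        for path in CACHE_ROOT.rglob("*"):
            if path.is_file() and now - path.stat().st_mtime > MAX_AGE_SECONDS:
                path.unlink()

Something along these lines, wrapped in an existing, tested package, is what I was hoping to find; typical use would look like this (process is just a stand-in for our own code):

    with cached_open("experiments/run42/data.csv", "r") as f:
        process(f)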
Question
I have searched a bit around with terms like python local file cache, file synchronization and the like, but all I have found handles caching of function return values. I am a bit surprised by this, because I would imagine it to be quite a general problem. My question is therefore: is there something I have overlooked, and, more importantly, are there technical terms describing this functionality that could help my research?
Thanks in advance, Gregers Poulsen
-- Update --
Due to other proprietary software packages I am forced to use Windows, so the solution naturally must support that.