Hacker News

We have 23TB of images stored in S3 and I was recently looking at moving them to Backblaze to save hundreds of dollars per month. These are all individual image files, because reasons.

Then I realized that S3 Glacier and Deep Archive were even less expensive than B2. I looked a bit further and found that Glacier/DA objects carry some fairly chonky per-object metadata that must be stored in regular S3, and for a lot of our images the metadata would be larger than the image itself. So Glacier/DA would actually increase our storage costs. Overall it probably wasn't a money-saving situation.
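To make the overhead concrete: per AWS's documentation, each object archived to Glacier or Deep Archive carries roughly 8 KB of metadata billed at S3 Standard rates plus 32 KB billed at the archive tier rate. A quick sketch of the per-object arithmetic, using illustrative us-east-1 prices (the rates here are assumptions; check current AWS pricing):

```python
# Compare per-object monthly cost: S3 Standard vs. S3 Glacier Deep Archive.
# Prices are illustrative/assumed, not authoritative.
STANDARD_PER_GB = 0.023        # $/GB-month, S3 Standard (assumed)
DEEP_ARCHIVE_PER_GB = 0.00099  # $/GB-month, Deep Archive (assumed)

# Per-object overhead for archived objects (per AWS docs):
# ~8 KB of metadata billed at Standard rates + ~32 KB billed at the archive rate.
META_STANDARD_KB = 8
META_ARCHIVE_KB = 32
KB = 1024
GB = 1024 ** 3

def standard_cost(size_bytes):
    """Monthly cost of one object kept in S3 Standard."""
    return size_bytes / GB * STANDARD_PER_GB

def deep_archive_cost(size_bytes):
    """Monthly cost of one object archived to Deep Archive, overhead included."""
    archive_bytes = size_bytes + META_ARCHIVE_KB * KB
    overhead_bytes = META_STANDARD_KB * KB
    return (archive_bytes / GB * DEEP_ARCHIVE_PER_GB
            + overhead_bytes / GB * STANDARD_PER_GB)

# For a tiny thumbnail, the metadata overhead dominates and archiving loses:
print(standard_cost(5 * KB) < deep_archive_cost(5 * KB))    # small file: Standard wins
print(deep_archive_cost(1 * GB) < standard_cost(1 * GB))    # large file: archive wins
```

The 8 KB of Standard-rate metadata alone costs about as much per month as ~8 KB of Standard storage, so any image smaller than a few tens of KB is cheaper left where it is.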

The ideal approach is to bundle the images up into tar files or similar, store those large files in the archive tier, and manage the metadata and indexing/access ourselves.
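A minimal sketch of that bundling idea, assuming uncompressed tar so member offsets stay byte-addressable (the function names and paths here are hypothetical; `TarInfo.offset_data` is the stdlib attribute giving the start of a member's data):

```python
import json
import tarfile

def bundle(image_paths, tar_path, index_path):
    """Pack small files into one uncompressed tar and write a JSON index of
    each member's byte offset and size, so a single image can later be
    fetched from the archived tar with an HTTP Range request."""
    with tarfile.open(tar_path, "w") as tar:  # "w" = uncompressed; offsets stay valid
        for path in image_paths:
            tar.add(path, arcname=path)

    index = {}
    with tarfile.open(tar_path, "r") as tar:
        for member in tar.getmembers():
            index[member.name] = {"offset": member.offset_data, "size": member.size}

    with open(index_path, "w") as f:
        json.dump(index, f)
    return index
```

Retrieval is then `bytes[offset : offset + size]` of the stored tar (e.g. an S3 `GetObject` with a `Range` header), with the JSON index kept somewhere cheap and hot.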

So, using rclone to copy 11TB of data to B2.




Wasabi is also a little cheaper than AWS S3, afaik.


Keep in mind that when you create something in Wasabi, you pay for it for three months, even if you delete it minutes later.
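That minimum-retention policy is easy to model: objects deleted before 90 days are still billed for the remainder of that window. A sketch, using an assumed price of roughly $0.0068/GB-month (check Wasabi's current price list):

```python
# Sketch of Wasabi's 90-day minimum storage charge.
# The price below is an assumption, not an official rate.
PRICE_PER_GB_MONTH = 0.0068  # $/GB-month (assumed)
MIN_DAYS = 90                # minimum billing period per object

def wasabi_storage_cost(gb, days_stored):
    """Cost of storing `gb` gigabytes for `days_stored` days: anything
    shorter than the 90-day minimum is billed as if kept the full 90."""
    billable_days = max(days_stored, MIN_DAYS)
    return gb * PRICE_PER_GB_MONTH * billable_days / 30

# Storing 1 TB for one day costs the same as storing it for 90 days:
print(wasabi_storage_cost(1000, 1) == wasabi_storage_cost(1000, 90))
```

So Wasabi only wins for data you intend to keep; churning objects through it erases the price advantage.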





