I’m looking at processing the Wikipedia dump, which is 60GB unzipped, loading it into a DB, and then processing it from the DB to build the static site. Can that be done on the free tier? CircleCI has a one-hour time limit, so I doubt it would fit even on the largest compute instance the free tier offers. Another thing: beyond the 1GB cache and the roughly 10GB repo size limits, the disk storage limit isn’t documented anywhere. I’m not sure what the cache is used for in CI, but surely that’s not the build disk? I reckon I might go over 100GB between the unzipped file and its duplicate in MongoDB.
Still no answer. I read up on what the cache means, and it doesn’t appear to be the actual disk size. I don’t intend to run this frequently, so I’d erase all the build artifacts afterwards. I guess I’ll just try it if it’s that easy; hopefully I don’t run into some time limit.
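Since the disk limit isn’t documented, one cheap sanity check before attempting the full 60GB job would be a throwaway step that just reports what the executor actually provides. A minimal sketch (assuming a Linux executor with standard coreutils; the step name is hypothetical):

```shell
# Hypothetical "check-resources" step to run first in the job:
# report disk and memory on the executor before downloading anything.
df -h /      # total and available space on the root filesystem
df -h .      # space backing the working directory (may be a different mount)
free -h      # available RAM, relevant since MongoDB keeps working sets in memory
```

If the available space reported here is well under ~100GB, there’s no point starting the dump download at all, regardless of the time limit.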