I’m trying to understand the feasibility of podcast distribution via IPFS.
The idea is to build a site that converts podcast/RSS feeds into IPFS/RSS feeds. Literally, make a copy of the original feed, replacing the media URLs with IPFS gateway URLs. Anyone who subscribes to the IPFS feed will then get their media via IPFS gateways.
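To make the conversion step concrete, here's a minimal sketch of the feed rewrite, assuming you already have a CID for each media file (that assumption is exactly the sticking point discussed below). The gateway hostname and the `url_to_cid` mapping are illustrative, not a real service:

```python
# Sketch of the feed-rewriting step: parse a podcast RSS feed and swap each
# episode's <enclosure> URL for an IPFS gateway URL. The CIDs are assumed to
# be known already; the gateway hostname is just an example.
import xml.etree.ElementTree as ET

GATEWAY = "https://ipfs.io/ipfs/"  # any public gateway would do

def rewrite_feed(rss_xml: str, url_to_cid: dict[str, str]) -> str:
    """Return a copy of the feed with enclosure URLs pointed at an IPFS gateway."""
    root = ET.fromstring(rss_xml)
    for enclosure in root.iter("enclosure"):
        original = enclosure.get("url")
        cid = url_to_cid.get(original)
        if cid:  # only rewrite URLs we actually have a CID for
            enclosure.set("url", GATEWAY + cid)
    return ET.tostring(root, encoding="unicode")
```

Everything else in the feed (titles, descriptions, pubDates) passes through untouched, so podcast apps shouldn't notice the difference.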
All the content would remain on the original hosting provider. The IPFS feed would load media files “on-demand” from the original feed into IPFS, and they would then naturally disappear from the network as demand goes away.
I don’t want to pin every media file for every podcast in the world (don’t want to host the files), but get them loaded into the IPFS network temporarily until demand drops off.
If someone requests a podcast that’s 5 years old, it will also load “on-demand” (from the original feed) into IPFS for delivery. There’s not much of a performance gain in that case, but it uses the same mechanism as a new podcast/episode.
I’m thinking I would create hashes for all the URLs in the feed without actually downloading the original files, and (somehow) stream the data only when requested.
Is on-demand file creation possible? i.e. create a hash, but stream the data only when requested (not pinned/stored in IPFS).
I’ve seen dynamic loading IPLD examples, but not sure if that’s what I need.
Can someone point me in the right direction?
Or let me know “it doesn’t work that way”?