Aquila HUB 0.2.0 adds experimental support for NLP model downloads from IPFS

Hi, everyone. This is Jubin Jose from https://Aquila.Network.

We’re excited to announce the release of Aquila HUB 0.2.0, which adds experimental IPFS support. Now you can run an IPFS node and leverage decentralized distribution of your fastText models.

Managing large model files is a common scalability problem in the machine learning community. Trusting a single HTTP endpoint to serve a model to a distributed system is a headache: every node in the network has to request the download from that central server even when the data is already available on another node, which is inefficient when file size is a concern. IPFS distributes these large files efficiently across the network without relying on a central point for availability. That’s why Aquila HUB is adding experimental IPFS support this early. Read more about this release and start using it at
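The key property that makes this work is content addressing: a file's identifier is derived from its bytes, so any node holding identical data can serve the same request. Here is a minimal sketch of that idea in Python, using a plain SHA-256 hex digest as the address (IPFS actually uses multihash-based CIDs, but the principle is the same):

```python
import hashlib

def content_address(data: bytes) -> str:
    # The identifier is derived from the content itself, so every
    # node that holds these exact bytes serves the same address --
    # no central server is needed to locate the file.
    return hashlib.sha256(data).hexdigest()

model_on_node_a = b"fasttext model weights"
model_on_node_b = b"fasttext model weights"  # same file, different node

# Identical content -> identical address, so a peer can fetch the
# model from whichever node already has it.
assert content_address(model_on_node_a) == content_address(model_on_node_b)

# Any change to the content produces a different address.
assert content_address(b"other weights") != content_address(model_on_node_a)
```

This is why a cluster of nodes never has to hit a central HTTP endpoint twice for the same model: once one peer has the bytes, every other peer can resolve the same address against it.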