Probably not, but BitTorrent (BT) might still be a good idea for your problem.
BT divides large files into chunks (so-called pieces) and calculates a SHA1 hash for each piece. Pieces can be downloaded independently of each other (out of order and in parallel). Only after a piece has been downloaded completely(!) is its SHA1 hash checked; if it does not match, the piece is discarded and downloaded again.
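Roughly, the check works like this; a minimal sketch in Python, assuming a single-file torrent and that you already have the piece length and the expected 20-byte SHA1 digests (in a real torrent both come from the info dictionary, see below):

```python
import hashlib

def verify_pieces(path, piece_length, expected_hashes):
    """Hash each piece of a downloaded single-file torrent and compare it
    with the expected 20-byte SHA1 digest (as stored in the .torrent's
    'pieces' field). Returns the indices of corrupted pieces."""
    bad = []
    with open(path, "rb") as f:
        for index, expected in enumerate(expected_hashes):
            piece = f.read(piece_length)   # the last piece may be shorter
            if hashlib.sha1(piece).digest() != expected:
                bad.append(index)          # only this piece needs re-downloading
    return bad
```

Multi-file torrents work the same way, except that pieces run straight across file boundaries.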
The size of a piece is variable but pre-determined by the torrent creator. The traditional default piece size is 256 KiB; larger torrents usually use larger piece sizes. For example, the Ubuntu 16.04 ISO (1.3 GiB) uses 512 KiB pieces and the Caine 7.0 ISO (2.9 GiB) uses 1 MiB pieces.
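If you want to check the piece size of a torrent yourself, the metainfo file is just bencoded data; here is a quick-and-dirty reader (example.torrent is a placeholder name):

```python
def bdecode(data, i=0):
    """Decode one bencoded value starting at offset i, return (value, next offset)."""
    c = data[i:i+1]
    if c == b"i":                                  # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i+1:end]), end + 1
    if c == b"l":                                  # list: l<items>e
        i, items = i + 1, []
        while data[i:i+1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                                  # dict: d<key><value>...e
        i, d = i + 1, {}
        while data[i:i+1] != b"e":
            key, i = bdecode(data, i)
            d[key], i = bdecode(data, i)
        return d, i + 1
    colon = data.index(b":", i)                    # byte string: <length>:<bytes>
    length, start = int(data[i:colon]), colon + 1
    return data[start:start+length], start + length

with open("example.torrent", "rb") as f:           # placeholder name
    meta, _ = bdecode(f.read())

info = meta[b"info"]
piece_length = info[b"piece length"]               # bytes per piece
num_pieces = len(info[b"pieces"]) // 20            # one 20-byte SHA1 per piece
print(piece_length // 1024, "KiB per piece,", num_pieces, "pieces")
```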
So, as long as your piece size is not too big, BitTorrent will achieve what you desire: a corrupted piece only costs you that one piece, not the whole download.
To save bandwidth, you may want to disable some BT features like DHT (distributed hash table) and PeX (peer exchange) and rely only on trackers (see the sketch below).
You might also want to limit the number of parallel connections and the number of pieces being downloaded at the same time, so that a piece can be completed before your connection breaks. (I think this can be achieved with qBittorrent's "Download in sequential order" setting.)
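If you prefer scripting this over clicking through a client's options, the same knobs are exposed by the libtorrent Python bindings; a rough sketch covering the connection limit, DHT and sequential download (exact setting/method names can differ between libtorrent versions, and example.torrent is again a placeholder):

```python
import libtorrent as lt

# Session-wide settings: cap parallel connections and turn off DHT and
# local service discovery so only the tracker(s) are used for peer discovery.
ses = lt.session({"connections_limit": 10,
                  "enable_dht": False,
                  "enable_lsd": False})

ti = lt.torrent_info("example.torrent")            # placeholder name
handle = ses.add_torrent({"ti": ti, "save_path": "."})

# Fetch pieces front-to-back so each piece can be completed (and checked)
# before a flaky connection drops; this is what qBittorrent's
# "Download in sequential order" option does.
handle.set_sequential_download(True)
```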
Many clients (like qBittorrent) can also use HTTP sources (web seeds) in addition to the torrent protocol, although I am not sure whether they apply the piece checksums to those HTTP sources as well.
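Whether a torrent offers HTTP sources at all can be seen in its metainfo; a short check, reusing the bdecode() helper from the sketch above:

```python
with open("example.torrent", "rb") as f:           # placeholder name
    meta, _ = bdecode(f.read())

# Web seeds (BEP 19) are listed under "url-list", older HTTP seeds
# (BEP 17) under "httpseeds"; both keys are optional.
print("web seeds:", meta.get(b"url-list"))
print("HTTP seeds:", meta.get(b"httpseeds"))
```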