Intuition
Revision-based datasets can be extremely large. For example, the basic Wikipedia Edit History dump alone is around 25 TB when decompressed, so decompressing all warehouses at once and using them directly for training is impractical; the data has to be processed in a streaming fashion instead, as sketched below.
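A minimal sketch of the streaming idea, assuming each warehouse is a bz2-compressed dump chunk (the file name below is hypothetical): decompress one file lazily, line by line, instead of materializing the full 25 TB on disk or in memory.

```python
import bz2
from typing import Iterator

# Hypothetical path to a single compressed Edit History chunk;
# the real dump is split into many such .bz2 files.
DUMP_PATH = "enwiki-pages-meta-history1.xml.bz2"

def iter_lines(path: str) -> Iterator[str]:
    """Stream-decompress one dump file line by line, so only a small
    buffer is held in memory rather than the whole decompressed chunk."""
    with bz2.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            yield line

if __name__ == "__main__":
    # Example: count lines without ever writing the decompressed file out.
    n_lines = sum(1 for _ in iter_lines(DUMP_PATH))
    print(f"streamed {n_lines} lines from {DUMP_PATH}")
```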