It basically comes down to one thing: The IR (internal resistance) of the cells.
A lower IR allows the cell to output more current, with less voltage drop. Period. No way to argue with Ohm's law.

A cell with a higher C rating should have a lower IR than a lower C-rated cell.
If there could ever be a standard rating for cells, that would be it: IR @ 70°F.
If anyone sees a flaw in that plan let me know, because I don't see it right now.
Edit: For example, by Ohm's law (I = V/R), for a 975A current to even be possible with ANY cell, its IR can be no greater than 3.8 milliohms (3.7 V / 975 A ≈ 0.0038 ohms). This is at the nominal 3.7 volts.
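Here's a minimal sketch of that Ohm's-law limit in Python, if anyone wants to plug in their own numbers. The function names and the 3.7 V default are my own choices for illustration, not anything standard:

```python
# Sketch of the Ohm's-law ceiling discussed above.
# Assumes the full nominal cell voltage drops across the IR (theoretical best case).
NOMINAL_VOLTS = 3.7  # nominal cell voltage used in the example

def max_current(ir_ohms: float, volts: float = NOMINAL_VOLTS) -> float:
    """Theoretical maximum current a cell could push, per I = V / R."""
    return volts / ir_ohms

def max_ir(target_amps: float, volts: float = NOMINAL_VOLTS) -> float:
    """Largest IR that still permits the target current, per R = V / I."""
    return volts / target_amps

print(f"Max IR for 975 A: {max_ir(975) * 1000:.1f} milliohms")  # ~3.8 mOhm
print(f"Max current at 3.8 mOhm: {max_current(0.0038):.0f} A")  # ~974 A
```

Real cells fare worse than this, since the loaded voltage sags below nominal, so treat these numbers as an upper bound.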