Ygor Serpa
1 min read · May 22, 2020


This is actually an incorrect statement: more RAM does not, by itself, make training faster. What extra RAM allows you to do is either (1) use a bigger model or (2) use a larger batch size. Of the two, (2) can improve performance if and only if your GPU has not yet reached its maximum occupancy.

For a fairer comparison, you should have tuned both Colabs to the maximum batch size their GPUs can fit, pushing occupancy as high as possible. I suspect you would then see a major improvement on the paid Colab.
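To make that tuning concrete, here is a minimal sketch of how one might search for the largest batch size that fits in GPU memory. The `try_fit` callback is hypothetical (not from the article): it would run one forward/backward pass at the given batch size and return `False` on an out-of-memory error.

```python
def max_batch_size(try_fit, start=1, limit=4096):
    """Double the batch size until it no longer fits, then binary-search
    the boundary. `try_fit(b)` is a user-supplied probe returning True
    when a training step at batch size b succeeds."""
    lo = 0            # largest size known to fit
    hi = start
    while hi <= limit and try_fit(hi):
        lo, hi = hi, hi * 2
    hi = min(hi, limit + 1)
    # binary search the interval (lo, hi) for the exact boundary
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if try_fit(mid):
            lo = mid
        else:
            hi = mid
    return lo

# Example with a fake memory limit of 300 samples per batch:
print(max_batch_size(lambda b: b <= 300))  # → 300
```

In practice the probe would wrap a real training step in a try/except around the framework's out-of-memory exception; the search logic itself is framework-agnostic.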

You can also rely on NVIDIA's command-line tools to probe the current occupancy / GPU utilization, though I am not sure whether you can do that from within Colab.
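For reference, a small sketch of such a probe using `nvidia-smi` (the query flags shown are real `nvidia-smi` options; in a Colab cell the equivalent shell command could be run with a leading `!`). It returns `None` gracefully on machines without an NVIDIA driver:

```python
import subprocess

def gpu_stats():
    """Return (utilization %, memory used MiB, memory total MiB) for the
    first GPU, or None if nvidia-smi is unavailable."""
    try:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None  # no NVIDIA driver / GPU on this machine
    # One line per GPU; parse the first, e.g. "45, 1234, 16160"
    first = out.strip().splitlines()[0]
    util, used, total = (int(x) for x in first.split(", "))
    return util, used, total

print(gpu_stats())
```

If utilization stays well below 100% during training, the GPU is underfed and a larger batch size (or faster input pipeline) is likely to help.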

Apart from that, great article. I myself was unaware that Colab had a paid version.
