I have a small question regarding memory usage in the simulated OPU.
I'm trying to replicate your transfer learning notebook, but on a language model using BERT. However, when I run the code on my laptop, the memory consumption is huge and the script crashes (16 GB RAM + 32 GB swap).
opu_mapping = OPUMap(n_components=n_components, simulated=True, max_n_features=25088)
This call eventually leads to the following lines in simulated_device.py (lines 69-70) being executed, which is where the crash happens:

real_comp = rng.normal(loc=0.0, scale=std, size=matrix_shape).astype(np.float32)
imag_comp = rng.normal(loc=0.0, scale=std, size=matrix_shape).astype(np.float32)
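For reference, since those two draws materialize a dense random matrix in memory (one float32 real part plus one float32 imaginary part per entry), the footprint can be estimated directly. A rough sketch, assuming matrix_shape is (n_components, max_n_features); the n_components value below is just an illustrative example, not taken from your notebook:

```python
import numpy as np

def simulated_matrix_bytes(n_components: int, max_n_features: int) -> int:
    """Estimate memory for a dense simulated transmission matrix:
    one float32 real part + one float32 imaginary part per entry."""
    bytes_per_entry = 2 * np.dtype(np.float32).itemsize  # real + imag
    return n_components * max_n_features * bytes_per_entry

# hypothetical example: 100_000 output components x 25_088 input features
gib = simulated_matrix_bytes(100_000, 25_088) / 2**30
print(f"{gib:.1f} GiB")  # ~18.7 GiB, before any temporaries
```

So with max_n_features=25088 the matrix alone can easily exceed the RAM of a laptop, which would be consistent with the crash I'm seeing.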
I tried running the same thing on the actual OPU, and it didn't consume much memory.
I was wondering how you simulate the OPU, and why the simulation requires that much memory. If the answer is too complex to explain here, I'd be more than happy with keywords or papers I could use to do some research of my own.