A: During the early release stage of LightOn Cloud, we are trying to make our service available to as many users as possible, while still letting each user find an available time slot with relative ease. As more OPUs are added to LightOn Cloud, we will be able to satisfy more access requests.
A: Yes, we do. It is here.
A: Any algorithm running on LightOn Cloud is the property of the user. LightOn makes no claim whatsoever on the IP generated by users while using LightOn Cloud. For details, please refer to the LightOn Cloud Terms & Conditions.
A: Please keep in mind that only the data folder (~/data) is persistent across bookings and OPUs. Everything else is erased at the end of your booking. For example, simulation results should go in the data folder.
A: Yes! The data folder (~/data) is shared across OPUs and is persistent across bookings as well.
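As a small illustration of the persistence rule above, it can be convenient to build all output paths through a helper that anchors them under ~/data (the helper name `persistent_path` is ours, not part of LightOn Cloud):

```python
from pathlib import Path

# ~/data is the only folder that persists across bookings and is shared across OPUs.
# Everything outside it is erased when the booking ends.
DATA_DIR = Path.home() / "data"

def persistent_path(name: str) -> Path:
    """Build a path under ~/data so the file survives the booking (illustrative helper)."""
    return DATA_DIR / name

# e.g. save simulation results here rather than in the home directory
results_file = persistent_path("simulation_results.npy")
```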
Q: Can some data be shared with other users, if for example, users belong to the same organization, company, or research team?
A: Yes! Users who would like to share data can be given a common shared directory. Of course, this directory is invisible to all other users. If you would like to set this up, just reach out to us at firstname.lastname@example.org
A: With LightOn Cloud you have access to a CPU, a GPU, and LightOn’s OPU. A simple function call allows computations to be performed on the OPU.
A: One can combine GPU and OPU computation in the same way as GPU and CPU: by running a part of the computations on the GPU and another part on the OPU. Note that the operation performed by the OPU is not differentiable.
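The split described above can be sketched as follows. This is a NumPy mock-up, not LightOn's actual API: `simulated_opu` stands in for the hardware call (a fixed random projection followed by a modulus-squared nonlinearity, which is how the OPU's operation is usually described), and the key point is that its output should be treated as fixed features, since the operation is not differentiable:

```python
import numpy as np

def simulated_opu(x: np.ndarray, n_components: int = 128) -> np.ndarray:
    """Stand-in for the OPU: a fixed random projection followed by |.|^2.

    The real hardware call is non-differentiable, so treat its output as
    constant features rather than backpropagating through it.
    """
    rng = np.random.default_rng(0)  # fixed seed: the transmission matrix never changes
    w = rng.standard_normal((x.shape[-1], n_components))
    return np.abs(x @ w) ** 2

# CPU/GPU part: any preprocessing or trainable layers produce x
x = np.random.default_rng(1).standard_normal((4, 32))

# "OPU" part: non-differentiable random features
features = simulated_opu(x)  # shape (4, 128)

# A downstream trainable model (e.g. a linear classifier) then consumes `features`.
```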
A: Examples and papers using LightOn’s OPU are listed in the Deeper Insight section of “Our Technology”.
A: LightOn Cloud provides a JupyterLab interface: before the start of each booking, you will receive a link by email. You can install any extension, and you can configure extensions to be installed at each boot to automate the process. Please note that if you are connected over SSH, you can find the JupyterLab link with the following command:
grep "NotebookApp.token =" /etc/jupyter/jupyter_notebook_config.py
A: You can launch a classic Jupyter Notebook from the Jupyter Lab’s help menu if you like.
- create a venv with
python -m venv myenv
- activate it with
source myenv/bin/activate
- register it as a Jupyter kernel with
pip install ipykernel
python -m ipykernel install --user --name=myenv
A: First of all, you need to provide your public SSH key when asked, after your first connection to the booking system. If you haven’t done so, please send an email to support[at]lighton.ai.
Having provided your SSH key, you can use:
ssh username@a.cloud.lighton.ai
(e.g. the above if you have booked SATURN A; replace username with your own)
A: The good news is that it is expected! It is caused by the server being re-installed at each booking. To disable strict host checking, you may add this to your SSH client configuration (~/.ssh/config):
Host ?.cloud.lighton.ai
    # Disable host checking for LightOn Cloud
    StrictHostKeyChecking no
    UserKnownHostsFile /dev/null
Host *
    # general options go here
A: For “root” rights you have to use sudo -i or sudo cmd (su - doesn’t work).
Q: How can I work with HDF5 files? / I am getting the following error: OSError: Unable to open file (unable to lock file, errno = 37, error message = ‘No locks available’)
A: This issue is due to the fact that the storage space is a network filesystem that doesn’t support the file locking required by HDF5. To avoid this problem, you have to set the HDF5_USE_FILE_LOCKING environment variable to FALSE.
If you work over SSH, just execute export HDF5_USE_FILE_LOCKING=FALSE in your bash session.
If you work in a Jupyter notebook, put the line `%env HDF5_USE_FILE_LOCKING=FALSE` at the beginning of your notebook.
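Equivalently, in a Python script you can set the variable from within the process itself. A minimal sketch (the h5py import is shown commented out, since the only requirement is that the variable be set before any HDF5 bindings open a file):

```python
import os

# Must be set before any HDF5-backed library (h5py, PyTables, pandas HDFStore)
# opens a file on the network filesystem.
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

# import h5py  # import HDF5 bindings only after the variable is set
```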