`neuro cp` command. For example:
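A minimal sketch of such an upload, assuming your dataset lives in a local `data/cifar-10` folder (the local path is a placeholder for your own data):

```shell
# Recursively upload a local dataset folder to platform storage;
# the local path data/cifar-10 is hypothetical
neuro cp -r data/cifar-10 storage:cifar-10
```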
`cifar-10` folder on your platform storage.
`cifar-10` storage folder with Alice and give her `manage`-level access to it (this means she will be able to read, change, and delete files in this folder).
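A sketch of the sharing command, assuming Alice's username on the platform is `alice`:

```shell
# Grant alice manage-level access (read, change, delete)
# to the cifar-10 storage folder
neuro share storage:cifar-10 alice manage
```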
`data/remote:` value in the project's `.neuro/live.yaml` file to keep the full URI of your data. This allows your teammates to use this data folder in their copies of the project (here, `default` is the name of our default cluster, and `bob` is your username on the platform):
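The volume declaration in `.neuro/live.yaml` might then look like this (a sketch; the exact surrounding keys depend on your project template):

```yaml
volumes:
  data:
    remote: storage://default/bob/cifar-10
    mount: /data
```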
`/data` folder in the local file system of the jobs you and your teammates work with.
`config` folder according to the AWS and GCP guides. Note that Git doesn't track these tokens, so your teammates also have to put their tokens in their local copies of the project.
`neuro-flow build myimage` (this is a necessary step to perform every time you update pip dependencies in `requirements.txt` or system requirements in
`neuro-flow run jupyter`. Notebooks are saved in the `<project>/notebooks` folder on your platform storage. To download them to the local copy of the project, run `neuro-flow download notebooks`.
`train` job and run `neuro-flow run train`. For example:
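A sketch of what the invocation might look like; the `lr` parameter is hypothetical and would have to be declared for the `train` job in `live.yaml`:

```shell
# Run the train job defined in .neuro/live.yaml
neuro-flow run train

# If the job declares parameters, they can be passed on
# the command line (parameter name is a placeholder)
neuro-flow run train --param lr 0.001
```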
`HELP.md` file in your project folder.
`neuro status <my-cool-job>`.
`jupyter-awesome-project` job with an ID of `job-fb835ab1-5285-4360-8ee1-880a8ebf824c` with Alice (where `awesome-project` is your project's slug), run:
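A sketch of the command, assuming Alice's username is `alice` and that you want to grant her `write`-level access:

```shell
# Share the running Jupyter job with alice at write level
neuro share job:job-fb835ab1-5285-4360-8ee1-880a8ebf824c alice write
```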
`write`-level access to your Jupyter Notebooks job, they can modify the notebooks on your platform storage. Therefore, to update those notebooks in the Git repository, you have to download them, commit, and push.
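That workflow can be sketched as follows, assuming the notebooks land in the local `notebooks/` folder of your project copy:

```shell
# Pull the updated notebooks from platform storage
neuro-flow download notebooks

# Commit and push them to the Git repository
git add notebooks
git commit -m "Update notebooks"
git push
```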
`--share <username>` when running it.
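For instance, to give Alice access right away when starting the Jupyter job (a sketch, assuming her username is `alice`):

```shell
# Share the job with alice at start time instead of afterwards
neuro-flow run jupyter --share alice
```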
`neuro-flow build myimage`, additional dependencies you state in `apt.txt` are installed in that environment, which is then saved on the platform's Docker registry. In this case, there is no need to share the images with teammates, as they build similar images from the same code base.
`images/myimage/ref` variable in the project's