
packaging: Specify cpuonly for conda meta.yaml #1106


Merged — 1 commit merged into pytorch:master from ensure_cpuonly on Dec 18, 2020

Conversation

@seemethere (Member) commented Dec 18, 2020

Package installation for pytorch binaries was failing due to conda/conda-package-handling#71.

Default to cpuonly since that won't fail out.

Similar to pytorch/audio#1105

This should resolve issues within the nightly pipeline for conda builds.

Signed-off-by: Eli Uriegas [email protected]
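
For reference, the simplest way to express this in a conda meta.yaml looks roughly like the sketch below (illustrative only, not the exact diff in this PR):

```yaml
# Sketch only -- not the exact change from this PR.
# Listing the cpuonly metapackage as a run requirement steers conda's
# solver toward the pytorch build that also depends on cpuonly
# (i.e. the CPU build).
requirements:
  run:
    - python
    - pytorch
    - cpuonly
```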

@xuzhao9 (Contributor) commented Dec 18, 2020

Sorry, I'm trying to understand: what does "cpuonly" mean in this PR? Does it mean we only upload the CPU version of torchtext nightly to conda (and that by default `conda install torchtext -c pytorch-nightly` won't have GPU support)?

@seemethere (Member, Author) replied:

> Sorry, I'm trying to understand: what does "cpuonly" mean in this PR? Does it mean we only upload the CPU version of torchtext nightly to conda (and that by default `conda install torchtext -c pytorch-nightly` won't have GPU support)?

cpuonly is a metapackage that we add as a dependency for CPU versions of pytorch.

Basically, the way the conda dependency resolver works is that if we explicitly request cpuonly as a package, it should pick the pytorch package that has cpuonly as a dependency (which is our CPU-specific package).

The same works for cudatoolkit versions as well; for example, `conda install -y -c pytorch pytorch cudatoolkit=11.0` should install the pytorch package that is compiled with CUDA 11.0 support.
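
Concretely, the two selection mechanisms look like this (channel and versions are just examples):

```sh
# Select the CPU build by requesting the cpuonly metapackage:
conda install -y -c pytorch pytorch cpuonly

# Select a CUDA build by pinning a cudatoolkit version instead:
conda install -y -c pytorch pytorch cudatoolkit=11.0
```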

@seemethere marked this pull request as draft December 18, 2020 19:22
@seemethere (Member, Author) commented:

Actually, @xuzhao9 made me realize that this might not be the best solution, since it'd limit the pytorch version you can install with torchtext to one that depends on cpuonly, meaning it'd only install the CPU-specific version. Let me see if I can resolve this.
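
For example (hypothetical, not output from this PR): if torchtext's own run requirements pinned cpuonly, a request like the one below could never resolve, because cpuonly and a CUDA-enabled pytorch build are mutually exclusive.

```sh
# Hypothetical: this cannot resolve if torchtext itself requires cpuonly,
# since cudatoolkit=11.0 selects a CUDA build of pytorch while cpuonly
# (pulled in via torchtext) demands the CPU-only build.
conda install -y -c pytorch-nightly torchtext pytorch cudatoolkit=11.0
```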

@seemethere marked this pull request as ready for review December 18, 2020 19:34
@seemethere (Member, Author) commented:

Okay, I made a change that puts this more in line with what torchaudio is doing, which shouldn't limit what people can install with torchtext.
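
Roughly, the torchaudio-style pattern injects the constraint from the build environment instead of hard-coding it, so the same recipe serves both CPU and CUDA variants. A hypothetical sketch (the environment-variable names here are placeholders, not necessarily the ones this repo uses):

```yaml
# Hypothetical sketch; environment-variable names are placeholders.
requirements:
  run:
    - python
    # The build scripts export a version constraint for pytorch, plus a
    # feature constraint that expands to "- cpuonly" for CPU builds and
    # to nothing for CUDA builds:
    - pytorch {{ environ.get('CONDA_PYTORCH_CONSTRAINT', '') }}
    {{ environ.get('CONDA_PYTORCH_FEATURE_CONSTRAINT', '') }}
```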

@codecov (bot) commented Dec 18, 2020

Codecov Report

Merging #1106 (f1c7f7d) into master (09f1b85) will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master    #1106   +/-   ##
=======================================
  Coverage   77.54%   77.54%           
=======================================
  Files          45       45           
  Lines        3086     3086           
=======================================
  Hits         2393     2393           
  Misses        693      693           


@xuzhao9 (Contributor) left a comment:


Looks good!

@seemethere merged commit 697979e into pytorch:master on Dec 18, 2020
@seemethere deleted the ensure_cpuonly branch December 18, 2020 20:13