We're working on improving the ability to build / install RAPIDS libraries as easily as possible, to allow integrating them into non-conda environments, containers, and so on, but we're not quite there yet. Also see this from the same conda setup when doing any pip install of anything. I've escalated internally and am trying to push things in the right direction.

By downloading and using the software, you agree to fully comply with the terms and conditions of the CUDA EULA.

If you insist on cudatoolkit 11.2, what the solver finds is an old version of PyTorch that did not have proper upper bounds on its dependency versions.

lowkickfighter (March 11, 2021): Jesus, thanks a lot. Is it so difficult for NVIDIA to have a clear list somewhere with the supported CPUs? I've been looking for three hours for the proper CUDA driver.

We don't control the CUDA EULA and what can or can't be redistributed.

The pytorch channel doesn't yet have any PyTorch builds compatible with cudatoolkit 11.2. The latest CUDA Toolkit, 11.2 Update 2, will support this card.

Yes, I'm very aware, but we unfortunately can't control the entire ecosystem. One cannot think only of how RAPIDS is built, but also of how it's used. This does affect RAPIDS use, since one will (of course) try to install RAPIDS + TensorFlow.

cudatoolkit and cudatoolkit-dev do not have matching versions. cudatoolkit-dev is not a package that we're involved in, and it generally isn't supported by NVIDIA in any way. I was giving context for a problem that does exist in general: the various conda packages available are not entirely consistent.
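The solver behaviour described above, where conda falls back to an old PyTorch build whose metadata lacks an upper bound on cudatoolkit, can usually be avoided by pinning both packages explicitly instead of letting the solver choose. A minimal sketch; the version numbers and channels here are illustrative assumptions, so check the pytorch channel for builds that actually exist for your platform:

```shell
# Sketch only: 1.8 / 11.1 are illustrative version pins, not a recommendation.
# Pinning both pytorch and cudatoolkit prevents the solver from satisfying
# the request with an ancient pytorch build that has no cudatoolkit upper bound.
conda install -c pytorch -c nvidia "pytorch=1.8.*" "cudatoolkit=11.1"

# To see which pytorch builds exist for a given cudatoolkit before installing:
conda search -c pytorch "pytorch[build=*cuda11*]"
```

If no build matching the requested cudatoolkit exists, the solver will either fail outright or, without the pins, silently select an older build, which is exactly the failure mode described above.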