python - Will scikit-learn use the GPU?

Tags: python tensorflow scikit-learn k-means neuraxle

Reading the k-means implementations in TensorFlow (http://learningtensorflow.com/lesson6/) and scikit-learn (http://scikit-learn.org/stable/modules/generated/sklearn.cluster.KMeans.html), I'm trying to decide which implementation to use.

scikit-learn is installed as part of the tensorflow Docker container, so either implementation can be used.
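As a quick sanity check (a minimal sketch; the exact versions printed will depend on the container image), you can confirm that both libraries are importable inside the container:

    # Verify that both libraries are available inside the tensorflow
    # Docker container; versions will vary with the image used.
    import sklearn
    import tensorflow as tf

    print("scikit-learn:", sklearn.__version__)
    print("tensorflow:", tf.__version__)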

Reasons to use scikit-learn:

scikit-learn contains less boilerplate than the tensorflow implementation.
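For illustration, a minimal k-means fit in scikit-learn (a sketch on toy data; n_clusters=3 is an arbitrary choice) shows how little code is needed:

    # Minimal scikit-learn k-means: fit on toy data, then inspect the
    # learned centroids and assign clusters to new points (CPU only).
    import numpy as np
    from sklearn.cluster import KMeans

    X = np.random.rand(1000, 2)           # 1000 random points in 2-D
    kmeans = KMeans(n_clusters=3).fit(X)  # the entire training step
    print(kmeans.cluster_centers_)        # learned centroids
    print(kmeans.predict(X[:5]))          # cluster labels for new data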

Reasons to use tensorflow:

If running on an Nvidia GPU, the algorithm will run in parallel; I'm not sure whether scikit-learn will utilize all available GPUs.

Reading https://www.quora.com/What-are-the-main-differences-between-TensorFlow-and-SciKit-Learn:

TensorFlow is more low-level; basically, the Lego bricks that help you to implement machine learning algorithms whereas scikit-learn offers you off-the-shelf algorithms, e.g., algorithms for classification such as SVMs, Random Forests, Logistic Regression, and many, many more. TensorFlow shines if you want to implement deep learning algorithms, since it allows you to take advantage of GPUs for more efficient training.

This statement again reinforces my assertion that "scikit-learn contains less boilerplate than the tensorflow implementation", but it also suggests that scikit-learn will not utilize all available GPUs?

Best Answer

TensorFlow only uses GPUs if it is built with CUDA and cuDNN support. By default it does not use the GPU, especially if it is running inside Docker, unless you use nvidia-docker and an image with built-in GPU support.
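As a sketch, you can check what the running TensorFlow build actually sees (tf.config.list_physical_devices is the TF 2.x API; TF 1.x used tf.test.is_gpu_available() instead):

    # List the GPUs visible to this TensorFlow build; an empty list
    # means everything (including k-means) will run on the CPU.
    import tensorflow as tf

    gpus = tf.config.list_physical_devices('GPU')
    print("GPUs visible to TensorFlow:", gpus)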

Scikit-learn is not intended to be used as a deep-learning framework and does not provide any GPU support.

Why is there no support for deep or reinforcement learning / Will there be support for deep or reinforcement learning in scikit-learn?

Deep learning and reinforcement learning both require a rich vocabulary to define an architecture, with deep learning additionally requiring GPUs for efficient computing. However, neither of these fit within the design constraints of scikit-learn; as a result, deep learning and reinforcement learning are currently out of scope for what scikit-learn seeks to achieve.

From http://scikit-learn.org/stable/faq.html#why-is-there-no-support-for-deep-or-reinforcement-learning-will-there-be-support-for-deep-or-reinforcement-learning-in-scikit-learn

Will you add GPU support in scikit-learn?

No, or at least not in the near future. The main reason is that GPU support will introduce many software dependencies and introduce platform specific issues. scikit-learn is designed to be easy to install on a wide variety of platforms. Outside of neural networks, GPUs don’t play a large role in machine learning today, and much larger gains in speed can often be achieved by a careful choice of algorithms.

From http://scikit-learn.org/stable/faq.html#will-you-add-gpu-support
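The FAQ's point about a "careful choice of algorithms" applies directly to k-means: on large datasets, scikit-learn's MiniBatchKMeans is usually much faster than plain KMeans at a small cost in cluster quality, all on the CPU. A hedged sketch (toy data; n_clusters and batch_size are arbitrary choices):

    # Compare full-batch and mini-batch k-means on the same data; the
    # mini-batch variant trades a little accuracy for a large speedup.
    import numpy as np
    from sklearn.cluster import KMeans, MiniBatchKMeans

    X = np.random.rand(100000, 10)  # stand-in for a large dataset

    full = KMeans(n_clusters=8).fit(X)
    mini = MiniBatchKMeans(n_clusters=8, batch_size=1024).fit(X)
    print("inertia:", full.inertia_, mini.inertia_)  # similar quality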

Regarding "python - Will scikit-learn use the GPU?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/41567895/
