Usage:
class cuml.dask.naive_bayes.MultinomialNB(*, client=None, verbose=False, **kwargs)
Distributed naive Bayes classifier for multinomial models.
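The client argument binds the estimator to an existing Dask client. A minimal sketch of construction, under the assumption that extra keyword arguments such as alpha (additive smoothing) are forwarded to the underlying single-GPU cuml.naive_bayes.MultinomialNB:

# Assumes client is an already-connected dask.distributed.Client.
# alpha=1.0 (Laplace smoothing) is assumed to be passed through to the
# underlying single-GPU estimator via **kwargs.
model = MultinomialNB(client=client, alpha=1.0)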
Examples:
Load the 20 newsgroups dataset from scikit-learn and train a naive Bayes classifier.
import cupy as cp
import dask.array
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from dask_cuda import LocalCUDACluster
from dask.distributed import Client
from cuml.dask.common import to_sparse_dask_array
from cuml.dask.naive_bayes import MultinomialNB

# Create a local CUDA cluster
cluster = LocalCUDACluster()
client = Client(cluster)

# Load the corpus and vectorize it into term counts
twenty_train = fetch_20newsgroups(subset='train', shuffle=True, random_state=42)
cv = CountVectorizer()
xformed = cv.fit_transform(twenty_train.data).astype(cp.float32)
X = to_sparse_dask_array(xformed, client)
y = dask.array.from_array(twenty_train.target, asarray=False, fancy=False).astype(cp.int32)

# Train the model
model = MultinomialNB()
model.fit(X, y)

# Compute accuracy on the training set
model.score(X, y)
Output:
0.9244298934936523
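A fitted model can also classify unseen documents. The sketch below assumes the cv, client, and model objects from the example above are still in scope; new_docs is a hypothetical list of raw strings:

# Vectorize new text with the already-fitted CountVectorizer
new_docs = ["CUDA makes matrix math fast"]  # hypothetical input
xformed_new = cv.transform(new_docs).astype(cp.float32)
X_new = to_sparse_dask_array(xformed_new, client)

# predict() returns a dask array; compute() pulls the labels to the client
pred = model.predict(X_new)
print(pred.compute())

The returned labels are integer class indices that index into twenty_train.target_names.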
Related:
- Python cuml.dask.feature_extraction.text.TfidfTransformer usage and code examples
- Python cuml.dask.manifold.UMAP usage and code examples
- Python cuml.dask.datasets.classification.make_classification usage and code examples
- Python cuml.dask.decomposition.PCA usage and code examples
- Python cuml.dask.decomposition.TruncatedSVD usage and code examples
- Python cuml.dask.preprocessing.LabelBinarizer usage and code examples
- Python cuml.datasets.make_blobs usage and code examples
- Python cuml.datasets.make_classification usage and code examples
- Python cuml.datasets.make_arima usage and code examples
- Python cuml.datasets.make_regression usage and code examples
- Python cuml.metrics.pairwise_distances.pairwise_distances usage and code examples
- Python cuml.neighbors.KNeighborsClassifier usage and code examples
- Python cuml.ensemble.RandomForestRegressor usage and code examples
- Python cuml.svm.SVC usage and code examples
- Python cuml.svm.SVR usage and code examples
- Python cuml.Lasso usage and code examples
- Python cuml.tsa.ARIMA.predict usage and code examples
- Python cuml.multiclass.OneVsRestClassifier usage and code examples
- Python cuml.preprocessing.LabelBinarizer usage and code examples
- Python cuml.random_projection.GaussianRandomProjection usage and code examples
Note: This article was compiled by 纯净天空 from the original rapids.ai documentation for cuml.dask.naive_bayes.MultinomialNB. Unless otherwise stated, copyright of the original code belongs to its authors; do not reproduce or copy this translation without permission.