Usage:
class cuml.dask.naive_bayes.MultinomialNB(*, client=None, verbose=False, **kwargs)
Distributed Naive Bayes classifier for the multinomial model.
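For reference, a minimal construction sketch. It assumes a running Dask CUDA cluster; the client argument attaches a specific Dask client, and extra keyword arguments (alpha is assumed here to be forwarded to the underlying single-GPU MultinomialNB) tune the model:

from dask_cuda import LocalCUDACluster
from dask.distributed import Client
from cuml.dask.naive_bayes import MultinomialNB

# Start a local cluster with one worker per GPU
cluster = LocalCUDACluster()
client = Client(cluster)

# Attach the client explicitly; alpha (Laplace smoothing) is assumed
# to be passed through **kwargs to the single-GPU estimator
model = MultinomialNB(client=client, verbose=False, alpha=1.0)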
Examples:
Load the 20 newsgroups dataset from Scikit-learn and train a Naive Bayes classifier.
import cupy as cp
import dask.array  # needed for dask.array.from_array below

from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer

from dask_cuda import LocalCUDACluster
from dask.distributed import Client

from cuml.dask.common import to_sparse_dask_array
from cuml.dask.naive_bayes import MultinomialNB

# Create a local CUDA cluster
cluster = LocalCUDACluster()
client = Client(cluster)

# Load corpus
twenty_train = fetch_20newsgroups(subset='train',
                                  shuffle=True, random_state=42)

cv = CountVectorizer()
xformed = cv.fit_transform(twenty_train.data).astype(cp.float32)
X = to_sparse_dask_array(xformed, client)

y = dask.array.from_array(twenty_train.target, asarray=False,
                          fancy=False).astype(cp.int32)

# Train model
model = MultinomialNB()
model.fit(X, y)

# Compute accuracy on training set
model.score(X, y)
Output:
0.9244298934936523
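A hedged continuation sketch: evaluating on the held-out test split. This reuses the client, cv, and model from the example above, and assumes the fitted estimator exposes predict and score as in the example:

# Vectorize the test split with the CountVectorizer fitted above
twenty_test = fetch_20newsgroups(subset='test',
                                 shuffle=True, random_state=42)
xformed_test = cv.transform(twenty_test.data).astype(cp.float32)
X_test = to_sparse_dask_array(xformed_test, client)
y_test = dask.array.from_array(twenty_test.target, asarray=False,
                               fancy=False).astype(cp.int32)

# predict() returns a Dask array of class labels;
# score() computes mean accuracy on the test set
preds = model.predict(X_test)
test_accuracy = model.score(X_test, y_test)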
Related usage
- Python cuml.dask.feature_extraction.text.TfidfTransformer usage and code examples
- Python cuml.dask.manifold.UMAP usage and code examples
- Python cuml.dask.datasets.classification.make_classification usage and code examples
- Python cuml.dask.decomposition.PCA usage and code examples
- Python cuml.dask.decomposition.TruncatedSVD usage and code examples
- Python cuml.dask.preprocessing.LabelBinarizer usage and code examples
- Python cuml.datasets.make_blobs usage and code examples
- Python cuml.datasets.make_classification usage and code examples
- Python cuml.datasets.make_arima usage and code examples
- Python cuml.datasets.make_regression usage and code examples
- Python cuml.metrics.pairwise_distances.pairwise_distances usage and code examples
- Python cuml.neighbors.KNeighborsClassifier usage and code examples
- Python cuml.ensemble.RandomForestRegressor usage and code examples
- Python cuml.svm.SVC usage and code examples
- Python cuml.svm.SVR usage and code examples
- Python cuml.Lasso usage and code examples
- Python cuml.tsa.ARIMA.predict usage and code examples
- Python cuml.multiclass.OneVsRestClassifier usage and code examples
- Python cuml.preprocessing.LabelBinarizer usage and code examples
- Python cuml.random_projection.GaussianRandomProjection usage and code examples
Note: this article was compiled by 純淨天空 from the original English work cuml.dask.naive_bayes.MultinomialNB by the rapids.ai team. Unless otherwise stated, copyright of the original code belongs to the original authors; do not reproduce or copy this translation without permission.