OpenML
Classifier implementing the k-nearest neighbors vote.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
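As a hedged illustration of how a flow like this is used (the dataset and the n_neighbors value below are arbitrary examples, not taken from the flow itself):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Any feature matrix X and label vector y would do; iris is just a stand-in.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Each prediction is a majority vote among the k closest training points.
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train, y_train)
    print(knn.score(X_test, y_test))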
A decision tree classifier.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Imputation transformer for completing missing values.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
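A minimal sketch of what such an imputation step does. The example uses the current SimpleImputer class; the flow listed here may wrap the older sklearn.preprocessing.Imputer, which had a very similar interface:

    import numpy as np
    from sklearn.impute import SimpleImputer

    X = np.array([[1.0, 2.0],
                  [np.nan, 3.0],
                  [7.0, 6.0]])

    # Missing entries are replaced by the per-column mean learned during fit.
    imputer = SimpleImputer(strategy="mean")
    print(imputer.fit_transform(X))
    # [[1. 2.]
    #  [4. 3.]
    #  [7. 6.]]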
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
29 runs, 0 likes, 0 downloads, 0 reach, 0 impact
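As a rough sketch of the Pipeline pattern these flows are built on (the scaler, classifier, and dataset below are arbitrary choices for illustration):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_iris(return_X_y=True)

    # Intermediate steps must implement fit/transform; only the final step
    # needs to be an estimator with fit/predict.
    pipe = Pipeline([
        ("scale", StandardScaler()),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    pipe.fit(X, y)
    print(pipe.predict(X[:5]))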
A random forest classifier. A random forest is a meta estimator that fits a number of decision tree classifiers on various sub-samples of the dataset and uses averaging to improve the predictive…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
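A hedged sketch of how a random forest flow is typically run (dataset and hyperparameters are illustrative, not this flow's settings):

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # Each tree is fit on a bootstrap sub-sample; predictions are averaged
    # across trees to form the ensemble output.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())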
Randomized search on hyper parameters. RandomizedSearchCV implements a "fit" and a "score" method. It also implements "predict", "predict_proba", "decision_function", "transform" and…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
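A minimal sketch of the RandomizedSearchCV usage the description refers to; the estimator, the sampled distributions, and the n_iter value are arbitrary examples:

    from scipy.stats import randint
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import RandomizedSearchCV

    X, y = load_iris(return_X_y=True)

    # Instead of an exhaustive grid, n_iter parameter settings are sampled
    # from the given distributions and evaluated by cross-validation.
    search = RandomizedSearchCV(
        RandomForestClassifier(random_state=0),
        param_distributions={
            "n_estimators": randint(10, 200),
            "max_depth": randint(2, 10),
        },
        n_iter=20,
        cv=3,
        random_state=0,
    )
    search.fit(X, y)
    print(search.best_params_, search.best_score_)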
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
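A hedged sketch of the boosting behaviour described above (dataset and n_estimators are illustrative; by default each stage fits a depth-1 decision tree):

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier

    X, y = load_breast_cancer(return_X_y=True)

    # Each boosting stage re-weights the training samples so that later
    # stages concentrate on the examples earlier stages got wrong.
    clf = AdaBoostClassifier(n_estimators=50, random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))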
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
9 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
10 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
10 runs, 0 likes, 0 downloads, 0 reach, 0 impact
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
4 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Encode categorical integer features using a one-hot aka one-of-K scheme. The input to this transformer should be a matrix of integers, denoting the values taken on by categorical (discrete) features.…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
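A small sketch of the one-of-K encoding described above; the integer matrix is an arbitrary example (recent scikit-learn versions also accept string categories):

    import numpy as np
    from sklearn.preprocessing import OneHotEncoder

    # Two categorical columns, already encoded as integers.
    X = np.array([[0, 1],
                  [1, 0],
                  [2, 1]])

    # Each column is expanded into one binary indicator column per category.
    encoder = OneHotEncoder()
    print(encoder.fit_transform(X).toarray())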
Feature selector that removes all low-variance features. This feature selection algorithm looks only at the features (X), not the desired outputs (y), and can thus be used for unsupervised learning.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
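A minimal sketch of the selector's behaviour on an arbitrary example matrix; with the default threshold only constant features are removed:

    import numpy as np
    from sklearn.feature_selection import VarianceThreshold

    # The first column is constant, so it carries no information.
    X = np.array([[0, 2, 0],
                  [0, 1, 4],
                  [0, 1, 1]])

    # Only X is inspected; no target y is needed, so this also works
    # for unsupervised problems.
    selector = VarianceThreshold()
    print(selector.fit_transform(X))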
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
24 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Multi-layer Perceptron classifier. This model optimizes the log-loss function using LBFGS or stochastic gradient descent. (Added in version 0.18.)
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
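A hedged sketch of training such a multi-layer perceptron; the dataset, hidden layer size, and iteration budget are arbitrary illustrative values:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Standardizing the inputs usually helps the gradient-based optimizers
    # (LBFGS / SGD / Adam) that train the network weights.
    scaler = StandardScaler().fit(X_train)
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    clf.fit(scaler.transform(X_train), y_train)
    print(clf.score(scaler.transform(X_test), y_test))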
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
9 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
6 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
25 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
6 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Classifier implementing the k-nearest neighbors vote.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
17 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
21 runs, 0 likes, 0 downloads, 0 reach, 0 impact
C-Support Vector Classification. The implementation is based on libsvm. The fit time complexity is more than quadratic with the number of samples, which makes it hard to scale to datasets with more than…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
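A hedged sketch of using an SVC flow; because of the libsvm fit-time scaling noted above, this kind of model is usually applied to datasets of modest size, with standardized inputs (dataset and parameters below are arbitrary):

    from sklearn.datasets import load_breast_cancer
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)

    # RBF-kernel support vector classifier on standardized features.
    model = make_pipeline(StandardScaler(), SVC(C=1.0, kernel="rbf", gamma="scale"))
    model.fit(X, y)
    print(model.score(X, y))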
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
14 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Soft Voting/Majority Rule classifier for unfitted estimators. (Added in version 0.17.)
6 runs, 0 likes, 0 downloads, 0 reach, 0 impact
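A minimal sketch of the voting ensemble described above; the three base estimators are arbitrary choices:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # voting="hard" takes the majority class label; voting="soft" would
    # average the predicted class probabilities instead.
    clf = VotingClassifier(
        estimators=[
            ("lr", LogisticRegression(max_iter=1000)),
            ("dt", DecisionTreeClassifier(random_state=0)),
            ("nb", GaussianNB()),
        ],
        voting="hard",
    )
    clf.fit(X, y)
    print(clf.score(X, y))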
Constructs a transformer from an arbitrary callable. A FunctionTransformer forwards its X (and optionally y) arguments to a user-defined function or function object and returns the result of this…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
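A minimal sketch of wrapping an arbitrary callable as a transformer; np.log1p is just an example function:

    import numpy as np
    from sklearn.preprocessing import FunctionTransformer

    # The wrapped callable is applied to X; its return value becomes the
    # transform output, so this fits anywhere a transformer is expected.
    log_transform = FunctionTransformer(np.log1p)

    X = np.array([[0.0, 1.0],
                  [2.0, 3.0]])
    print(log_transform.fit_transform(X))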
Concatenates results of multiple transformer objects. This estimator applies a list of transformer objects in parallel to the input data, then concatenates the results. This is useful to combine…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
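A minimal sketch of the parallel-transformers pattern described above; the PCA/SelectKBest combination and the iris data are arbitrary examples:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import SelectKBest
    from sklearn.pipeline import FeatureUnion

    X, y = load_iris(return_X_y=True)

    # Both transformers see the same input; their outputs are concatenated
    # column-wise (2 PCA components + 1 selected feature = 3 columns).
    union = FeatureUnion([
        ("pca", PCA(n_components=2)),
        ("kbest", SelectKBest(k=1)),
    ])
    print(union.fit_transform(X, y).shape)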
Soft Voting/Majority Rule classifier for unfitted estimators. (Added in version 0.17.)
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
A decision tree classifier.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
An extremely randomized tree classifier. Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups,…
14 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Kernel Principal component analysis (KPCA). Non-linear dimensionality reduction through the use of kernels.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
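A hedged sketch of kernel PCA on a toy non-linear dataset; the RBF kernel and the gamma value are arbitrary illustrative choices:

    from sklearn.datasets import make_circles
    from sklearn.decomposition import KernelPCA

    # Two concentric circles: not linearly separable in the input space.
    X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

    # The RBF kernel implicitly maps the data to a space where the principal
    # components can separate the two rings.
    kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10)
    X_kpca = kpca.fit_transform(X)
    print(X_kpca.shape)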
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
20 runs, 0 likes, 0 downloads, 0 reach, 0 impact
An extra-trees classifier. This class implements a meta estimator that fits a number of randomized decision trees (a.k.a. extra-trees) on various sub-samples of the dataset and uses averaging to…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Randomized search on hyper parameters. RandomizedSearchCV implements a "fit" and a "score" method. It also implements "predict", "predict_proba", "decision_function", "transform" and…
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Gradient Boosting for classification. GB builds an additive model in a forward stage-wise fashion; it allows for the optimization of arbitrary differentiable loss functions. In each stage…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
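A hedged sketch of the stage-wise boosting described above (dataset and hyperparameters are illustrative only):

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Each of the n_estimators stages fits a small regression tree to the
    # negative gradient of the loss of the current ensemble.
    clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))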
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
18 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Gaussian Naive Bayes (GaussianNB). Can perform online updates to model parameters via the `partial_fit` method. For details on the algorithm used to update feature means and variances online, see Stanford CS…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
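A minimal sketch of the online `partial_fit` usage mentioned above, on an arbitrary toy dataset:

    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    X = np.array([[1.0], [2.0], [10.0], [11.0]])
    y = np.array([0, 0, 1, 1])

    clf = GaussianNB()
    # The full set of classes must be declared on the first partial_fit call;
    # later calls update the per-class means and variances incrementally.
    clf.partial_fit(X[:2], y[:2], classes=[0, 1])
    clf.partial_fit(X[2:], y[2:])
    print(clf.predict([[1.5], [10.5]]))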
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
7 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Naive Bayes classifier for multinomial models. The multinomial Naive Bayes classifier is suitable for classification with discrete features (e.g., word counts for text classification). The multinomial…
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
6 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
3 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Imputation transformer for completing missing values.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
39 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Standardize features by removing the mean and scaling to unit variance. Centering and scaling happen independently on each feature by computing the relevant statistics on the samples in the training…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
10 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Generate polynomial and interaction features. Generate a new feature matrix consisting of all polynomial combinations of the features with degree less than or equal to the specified degree. For…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Principal component analysis (PCA). Linear dimensionality reduction using Singular Value Decomposition of the data to project it to a lower dimensional space. It uses the LAPACK implementation of the…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
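A minimal sketch of PCA as a dimensionality-reduction step; the iris data and the choice of two components are arbitrary:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA

    X, _ = load_iris(return_X_y=True)

    # Project the 4-dimensional data onto the 2 directions of largest variance.
    pca = PCA(n_components=2)
    X_2d = pca.fit_transform(X)
    print(X_2d.shape, pca.explained_variance_ratio_)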
A random forest classifier. A random forest is a meta estimator that fits a number of decision tree classifiers on various sub-samples of the dataset and uses averaging to improve the predictive…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Transforms features by scaling each feature to a given range. This estimator scales and translates each feature individually such that it is in the given range on the training set, i.e. between zero…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Logistic Regression (aka logit, MaxEnt) classifier. In the multiclass case, the training algorithm uses the one-vs-rest (OvR) scheme if the 'multi_class' option is set to 'ovr', and uses the cross-…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Randomized search on hyper parameters. RandomizedSearchCV implements a "fit" and a "score" method. It also implements "predict", "predict_proba", "decision_function", "transform" and…
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Multi-layer Perceptron classifier. This model optimizes the log-loss function using LBFGS or stochastic gradient descent. (Added in version 0.18.)
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Logistic Regression (aka logit, MaxEnt) classifier. In the multiclass case, the training algorithm uses the one-vs-rest (OvR) scheme if the 'multi_class' option is set to 'ovr', and uses the cross-…
4 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Classifier implementing the k-nearest neighbors vote.
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
A decision tree classifier.
18 runs, 0 likes, 0 downloads, 0 reach, 0 impact
A random forest classifier. A random forest is a meta estimator that fits a number of decision tree classifiers on various sub-samples of the dataset and uses averaging to improve the predictive…
19 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Gradient Boosting for classification. GB builds an additive model in a forward stage-wise fashion; it allows for the optimization of arbitrary differentiable loss functions. In each stage…
8 runs, 0 likes, 0 downloads, 0 reach, 0 impact
C-Support Vector Classification. The implementation is based on libsvm. The fit time complexity is more than quadratic with the number of samples, which makes it hard to scale to datasets with more than…
18 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Learner mlr.classif.rpart from package(s) rpart.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Logistic Regression (aka logit, MaxEnt) classifier. In the multiclass case, the training algorithm uses the one-vs-rest (OvR) scheme if the 'multi_class' option is set to 'ovr', and uses the…
2 runs, 0 likes, 0 downloads, 0 reach, 0 impact
A random forest classifier. A random forest is a meta estimator that fits a number of decision tree classifiers on various sub-samples of the dataset and uses averaging to improve the predictive…
3 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
A decision tree classifier.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Applies transformers to columns of an array or pandas DataFrame. This estimator allows different columns or column subsets of the input to be transformed separately and the features generated by each…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
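A hedged sketch of per-column preprocessing with ColumnTransformer; the DataFrame, column names, and chosen transformers are made-up examples:

    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.preprocessing import OneHotEncoder, StandardScaler

    df = pd.DataFrame({
        "age": [25, 32, 47],
        "income": [40000, 52000, 61000],
        "city": ["Lisbon", "Porto", "Lisbon"],
    })

    # Numeric columns are scaled, the categorical column is one-hot encoded,
    # and the resulting blocks are concatenated column-wise.
    preprocess = ColumnTransformer([
        ("num", StandardScaler(), ["age", "income"]),
        ("cat", OneHotEncoder(), ["city"]),
    ])
    print(preprocess.fit_transform(df).shape)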
Implementation of the scikit-learn classifier API for Keras.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Automatically created tensorflow flow.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Applies transformers to columns of an array or pandas DataFrame. This estimator allows different columns or column subsets of the input to be transformed separately and the features generated by each…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
1 run, 0 likes, 0 downloads, 0 reach, 0 impact
Dimensionality reduction using truncated SVD (aka LSA). This transformer performs linear dimensionality reduction by means of truncated singular value decomposition (SVD). Contrary to PCA, this…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Imputation transformer for completing missing values.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Automatically created tensorflow flow.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
4 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Encode categorical features as an integer array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features. The…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Scale features using statistics that are robust to outliers. This Scaler removes the median and scales the data according to the quantile range (defaults to IQR: Interquartile Range). The IQR is the…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Classifier implementing the k-nearest neighbors vote.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Applies transformers to columns of an array or pandas DataFrame. This estimator allows different columns or column subsets of the input to be transformed separately and the features generated by each…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Scale features using statistics that are robust to outliers. This Scaler removes the median and scales the data according to the quantile range (defaults to IQR: Interquartile Range). The IQR is the…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s, where `u` is the mean of the training samples or zero if…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Classifier implementing the k-nearest neighbors vote.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Classifier implementing the k-nearest neighbors vote.
3 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Weka implementation.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Weka implementation.
1 run, 0 likes, 0 downloads, 0 reach, 0 impact
Weka implementation.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Niels Landwehr, Mark Hall, Eibe Frank (2005). Logistic Model Trees. Machine Learning. 95(1-2):161-205. Marc Sumner, Eibe Frank, Mark Hall: Speeding up Logistic Model Tree Induction. In: 9th European…
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Ross Quinlan (1993). C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, San Mateo, CA.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
J. Friedman, T. Hastie, R. Tibshirani (1998). Additive Logistic Regression: a Statistical View of Boosting. Stanford University.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Leo Breiman (2001). Random Forests. Machine Learning. 45(1):5-32.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
Ross Quinlan (1993). C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, San Mateo, CA.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact
J. Friedman, T. Hastie, R. Tibshirani (1998). Additive Logistic Regression: a Statistical View of Boosting. Stanford University.
0 runs, 0 likes, 0 downloads, 0 reach, 0 impact