ADAPT

ADAPT is a Python package providing some well-known domain adaptation methods.

The purpose of domain adaptation (DA) methods is to handle the common issue encountered in machine learning where training and testing data are drawn from different distributions.

In the domain adaptation setting, one aims to learn a task with an estimator \(f\) mapping input data \(X\) to output data \(y\), also called labels. \(y\) takes values either in a finite set of integers (for classification tasks) or in an interval of real values (for regression tasks).

Besides, this setting considers, on the one hand, a source domain from which a large sample of labeled data \((X_S, y_S)\) is available and, on the other hand, a target domain from which no (or only a few) labeled data \((X_T, y_T)\) are available. If no labeled target data are available, one refers to unsupervised domain adaptation. If a few labeled target data are available, one refers to supervised domain adaptation, also called few-shot learning.

The goal of domain adaptation is to build a good estimator \(f_T\) on the target domain by leveraging information from the source domain. DA methods follow one of three strategies: feature-based, instance-based, or parameter-based adaptation.

The following sections explain each strategy and list the methods implemented in the ADAPT package.
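Whatever the strategy, ADAPT methods share a scikit-learn-like interface: a method is instantiated with a base estimator and the target inputs Xt, fitted on the labeled source data, then used to predict or score on the target domain. The snippet below is a minimal sketch of this generic workflow using the CORAL method on random toy data; it assumes the usual `fit`/`score` interface, so treat it as an outline rather than a reference implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from adapt.feature_based import CORAL

rng = np.random.RandomState(0)

# Labeled source sample (Xs, ys) and shifted target sample (yt only used for evaluation)
Xs = rng.randn(200, 2)
ys = (Xs[:, 0] > 0.).astype(int)
Xt = rng.randn(200, 2) + np.array([1., 0.5])
yt = (Xt[:, 0] > 1.).astype(int)

# Instantiate the DA method with a base estimator and the target inputs Xt
model = CORAL(LogisticRegression(), Xt=Xt)

# Fit on the labeled source data; the adaptation itself uses Xt internally
model.fit(Xs, ys)

# Evaluate the adapted estimator on the target domain
print("Target score:", model.score(Xt, yt))
```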

adapt.feature_based: Feature-Based Methods

Feature-based methods are based on the search for common features which behave similarly with respect to the task on the source and target domains.

A new feature representation (often called the encoded feature space) is built with a projection \(\phi\) which aims to correct the difference between the source and target distributions. The task is then learned in this encoded feature space.
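As a hedged illustration of this strategy, the sketch below uses the Subspace Alignment (SA) method from the list that follows: the projection \(\phi\) aligning the source subspace with the target one is learned during `fit`, and the base estimator is trained in the resulting encoded feature space. The toy data generator and the exact constructor parameters are assumptions based on the usual ADAPT interface.

```python
from sklearn.svm import SVC
from adapt.feature_based import SA
from adapt.utils import make_classification_da

# Toy DA dataset; assumed to return source (Xs, ys) and target (Xt, yt) samples
Xs, ys, Xt, yt = make_classification_da()

# Learn the alignment between the source and target subspaces, then fit the
# base estimator on the encoded (aligned) source features
model = SA(SVC(), Xt=Xt)
model.fit(Xs, ys)

print("Target score:", model.score(Xt, yt))
```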


Methods

feature_based.PRED([estimator, Xt, yt, ...])

PRED: Feature Augmentation with SrcOnly Prediction

feature_based.FA([estimator, Xt, yt, copy, ...])

FA: Feature Augmentation.

feature_based.CORAL([estimator, Xt, ...])

CORAL: CORrelation ALignment

feature_based.SA([estimator, Xt, ...])

SA : Subspace Alignment

feature_based.TCA([estimator, Xt, ...])

TCA : Transfer Component Analysis

feature_based.fMMD([estimator, Xt, ...])

fMMD : feature Selection with MMD

feature_based.DeepCORAL([encoder, task, Xt, ...])

DeepCORAL: Deep CORrelation ALignment

feature_based.DANN([encoder, task, ...])

DANN: Discriminative Adversarial Neural Network

feature_based.ADDA([encoder, task, ...])

ADDA: Adversarial Discriminative Domain Adaptation

feature_based.WDGRL([encoder, task, ...])

WDGRL: Wasserstein Distance Guided Representation Learning

feature_based.CDAN([encoder, task, ...])

CDAN: Conditional Adversarial Domain Adaptation

feature_based.MCD([encoder, task, Xt, ...])

MCD: Maximum Classifier Discrepancy

feature_based.MDD([encoder, task, Xt, ...])

MDD: Margin Disparity Discrepancy

feature_based.CCSA([encoder, task, Xt, yt, ...])

CCSA : Classification and Contrastive Semantic Alignment

adapt.instance_based: Instance-Based Methods

The general principle of these methods is to reweight the labeled training data in order to correct the difference between the source and target distributions. This reweighting consists in multiplying, during the training process, the individual loss of each training instance by a positive weight.

The reweighted training instances are then directly used to learn the task.
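The sketch below illustrates this strategy with Kernel Mean Matching (KMM, listed below), which estimates such weights by matching the source and target means in a kernel space. The `predict_weights` accessor used to inspect the learned weights is an assumption based on the usual interface of ADAPT's instance-based methods.

```python
from sklearn.linear_model import LogisticRegression
from adapt.instance_based import KMM
from adapt.utils import make_classification_da

# Toy DA dataset; assumed to return source (Xs, ys) and target (Xt, yt) samples
Xs, ys, Xt, yt = make_classification_da()

# Reweight the source instances so that their distribution matches the target
# one, then fit the base estimator on the weighted source data
model = KMM(LogisticRegression(), Xt=Xt, kernel="rbf")
model.fit(Xs, ys)

# Learned positive weights, one per source training instance (assumed accessor)
weights = model.predict_weights()

print("Target score:", model.score(Xt, yt))
```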


Methods

instance_based.LDM([estimator, Xt, copy, ...])

LDM : Linear Discrepancy Minimization

instance_based.KLIEP([estimator, Xt, ...])

KLIEP: Kullback–Leibler Importance Estimation Procedure

instance_based.KMM([estimator, Xt, kernel, ...])

KMM: Kernel Mean Matching

instance_based.ULSIF([estimator, Xt, ...])

ULSIF: Unconstrained Least-Squares Importance Fitting

instance_based.RULSIF([estimator, Xt, ...])

RULSIF: Relative Unconstrained Least-Squares Importance Fitting

instance_based.NearestNeighborsWeighting([...])

NNW : Nearest Neighbors Weighting

instance_based.IWC([estimator, Xt, yt, ...])

IWC: Importance Weighting Classifier

instance_based.IWN([estimator, weighter, ...])

IWN : Importance Weighting Network

instance_based.BalancedWeighting([...])

BW : Balanced Weighting

instance_based.TrAdaBoost([estimator, Xt, ...])

Transfer AdaBoost for Classification

instance_based.TrAdaBoostR2([estimator, Xt, ...])

Transfer AdaBoost for Regression

instance_based.TwoStageTrAdaBoostR2([...])

Two Stage Transfer AdaBoost for Regression

instance_based.WANN([task, weighter, Xt, ...])

WANN : Weighting Adversarial Neural Network

adapt.parameter_based: Parameter-Based Methods

In parameter-based methods, the parameters of one or several pre-trained models built with the source data are adapted to build a model suited to the task on the target domain.
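The sketch below outlines this strategy with RegularTransferLR (listed below): a linear model is first fitted on the source data only, and its parameters are then adapted on the few available labeled target data while being regularized towards the source solution. The fit-on-target call and the `lambda_` trade-off name are assumptions based on the usual ADAPT interface.

```python
from sklearn.linear_model import Ridge
from adapt.parameter_based import RegularTransferLR
from adapt.utils import make_regression_da

# Toy DA regression dataset; assumed to return (Xs, ys, Xt, yt)
Xs, ys, Xt, yt = make_regression_da()

# 1. Pre-train a source-only linear model
src_model = Ridge().fit(Xs, ys)

# 2. Adapt its parameters on a few labeled target instances, regularized
#    towards the source parameters (lambda_ assumed to set the trade-off)
model = RegularTransferLR(src_model, lambda_=1.0)
model.fit(Xt[:10], yt[:10])

print("Target score:", model.score(Xt, yt))
```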


Methods

parameter_based.LinInt([estimator, Xt, yt, ...])

LinInt: Linear Interpolation between SrcOnly and TgtOnly.

parameter_based.RegularTransferLR([...])

Regular Transfer with Linear Regression

parameter_based.RegularTransferLC([...])

Regular Transfer for Linear Classification

parameter_based.RegularTransferNN([task, ...])

Regular Transfer with Neural Network

parameter_based.RegularTransferGP([...])

Regular Transfer with Gaussian Process

parameter_based.FineTuning([encoder, task, ...])

FineTuning : finetunes pretrained networks on target data.

parameter_based.TransferTreeClassifier([...])

TransferTreeClassifier: Modify a source Decision tree on a target dataset.

parameter_based.TransferForestClassifier([...])

TransferForestClassifier: Modify a source Random Forest on a target dataset.

parameter_based.TransferTreeSelector([...])

TransferTreeSelector : Run several decision tree transfer algorithms on a target dataset and select the best one.

parameter_based.TransferForestSelector([...])

TransferForestSelector : Run several decision tree transfer algorithms on a target dataset and select the best one for each tree of the random forest.

adapt.metrics: Metrics

This module contains functions to compute adaptation metrics.
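These metrics take the source and target input samples and return a scalar measuring how far apart the two distributions are, which can help compare adaptation settings or select hyper-parameters (for instance through make_uda_scorer). A minimal sketch, assuming the signatures listed below and random toy data:

```python
import numpy as np
from adapt.metrics import cov_distance, normalized_linear_discrepancy, make_uda_scorer

rng = np.random.RandomState(0)
Xs = rng.randn(100, 3)          # source sample
Xt = rng.randn(100, 3) + 0.5    # shifted target sample

# Scalar discrepancies between the source and target samples
print(cov_distance(Xs, Xt))
print(normalized_linear_discrepancy(Xs, Xt))

# Wrap a metric as a scorer, intended for scikit-learn model selection
scorer = make_uda_scorer(cov_distance, Xs, Xt)
```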

metrics.make_uda_scorer(func, Xs, Xt[, ...])

Make a scorer function from an adapt metric.

metrics.cov_distance(Xs, Xt)

Compute the mean absolute difference between the covariance matrices of Xs and Xt.

metrics.neg_j_score(Xs, Xt[, max_centers, sigma])

Compute the negative J-score between Xs and Xt.

metrics.linear_discrepancy(Xs, Xt[, ...])

Compute the linear discrepancy between Xs and Xt.

metrics.normalized_linear_discrepancy(Xs, Xt)

Compute the normalized linear discrepancy between Xs and Xt.

metrics.frechet_distance(Xs, Xt)

Compute the Fréchet distance between Xs and Xt.

metrics.normalized_frechet_distance(Xs, Xt)

Compute the normalized Fréchet distance between Xs and Xt.

metrics.domain_classifier(Xs, Xt[, classifier])

Return 1 minus the mean square error of a classifier discriminating between Xs and Xt.

metrics.reverse_validation(model, Xs, ys, ...)

Reverse validation.

adapt.utils: Utility Functions

This module contains utility functions used in the previous modules.
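For example, the two dataset generators listed below produce small synthetic source/target samples that are convenient for trying out the methods above; the returned order (Xs, ys, Xt, yt) is an assumption here.

```python
from adapt.utils import make_classification_da, make_regression_da, set_random_seed

# Fix the NumPy and TensorFlow seeds for reproducibility
set_random_seed(42)

# Synthetic DA datasets (return order (Xs, ys, Xt, yt) assumed)
Xs, ys, Xt, yt = make_classification_da(n_samples=100)
Xs_r, ys_r, Xt_r, yt_r = make_regression_da(n_samples=100)

print(Xs.shape, Xt.shape)
```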

utils.UpdateLambda([lambda_init, ...])

Update Lambda trade-off

utils.accuracy(y_true, y_pred)

Custom accuracy function which can handle probability vectors in both binary and multiclass classification.

utils.check_arrays(X, y, **kwargs)

Check arrays and reshape 1D array in 2D array of shape (-1, 1).

utils.check_estimator([estimator, copy, ...])

Check estimator.

utils.check_network(network[, copy, name, ...])

Check if the given network is a tensorflow Model.

utils.get_default_encoder([name, state])

Return a tensorflow Model of one layer with 10 neurons and a relu activation.

utils.get_default_task([activation, name, state])

Return a tensorflow Model of two hidden layers with 10 neurons each and relu activations.

utils.get_default_discriminator([name, state])

Return a tensorflow Model of two hidden layers with 10 neurons each and relu activations.

utils.GradientHandler(*args, **kwargs)

Multiply gradients with a scalar during backpropagation.

utils.make_classification_da([n_samples, ...])

Generate a classification dataset for DA.

utils.make_regression_da([n_samples, ...])

Generate a regression dataset for DA.

utils.check_sample_weight(sample_weight, X)

Check sample weights.

utils.set_random_seed(random_state)

Set the random seed for NumPy and TensorFlow.

utils.check_fitted_estimator(estimator)

Check Fitted Estimator

utils.check_fitted_network(estimator)

Check Fitted Network

Synthetic Examples

Real Examples