turicreate.decision_tree_regression.DecisionTreeRegression.get_feature_importance

DecisionTreeRegression.get_feature_importance()

Get the importance of features used by the model.

The importance of a feature X is measured by the total number of times X occurs as a branching node across all trees in the model.

When X is a categorical feature, e.g. “Gender”, the index column contains the feature’s value, e.g. “M” or “F”. When X is a numerical feature, the index is None.

Returns:
out : SFrame

A table with three columns (name, index, and count), sorted by ‘count’ in descending order.

Examples

>>> m.get_feature_importance()
Rows: 31
Data:
    +-----------------------------+-------+-------+
    |             name            | index | count |
    +-----------------------------+-------+-------+
    | DER_mass_transverse_met_lep |  None |   66  |
    |         DER_mass_vis        |  None |   65  |
    |          PRI_tau_pt         |  None |   61  |
    |         DER_mass_MMC        |  None |   59  |
    |      DER_deltar_tau_lep     |  None |   58  |
    |          DER_pt_tot         |  None |   41  |
    |           PRI_met           |  None |   38  |
    |     PRI_jet_leading_eta     |  None |   30  |
    |     DER_deltaeta_jet_jet    |  None |   27  |
    |       DER_mass_jet_jet      |  None |   24  |
    +-----------------------------+-------+-------+
    [31 rows x 3 columns]
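
As a minimal sketch of how such a table might be produced and filtered (the dataset path, the target column name 'label', and the count threshold are illustrative assumptions, not part of the API), the importance table returned by this method is an ordinary SFrame and can be sliced like any other:

>>> import turicreate as tc
>>> data = tc.SFrame('higgs.sframe')           # assumed dataset path
>>> train, test = data.random_split(0.8)
>>> m = tc.decision_tree_regression.create(train, target='label')  # 'label' is assumed
>>> importance = m.get_feature_importance()
>>> # Keep only features used as a branching node at least 10 times.
>>> frequent = importance[importance['count'] >= 10]
>>> frequent.print_rows(num_rows=5)

Because the result is a regular SFrame, it can also be sorted, joined, or exported (for example with SFrame.export_csv) like any other table.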