Creating a decision tree model
Create a model that works well on mid-volume, highly non-linear data.
- In the Model creation step, from the Create model drop-down list, click Decision tree.
- In the Create decision tree workspace, in the Summary section, enter a Model name and a Description. Click Create model.
- In the Create model dialog box, select one of the splitting methods:
  - If you want to select the most statistically significant point to split, as measured by the chi-squared statistic:
    - Select the CHAID check box.
    - In the Significance is over field, enter the minimum level of significance for splitting.
  - If you want to select the point to split that has the lowest impurity (the lowest proportion of cases on the wrong side of the split):
    - Select the CART check box.
    - In the Impurity is under field, set the maximum level of impurity for splitting. Raising the value can generate larger models.
  - If you want to select the point to split that has the highest information value, as described by the entropy of the distribution (gain):
    - Select the ID3 check box.
    - In the Gain is over field, set the minimum level of gain for splitting. Lowering the value can generate larger models.
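The documentation does not give the product's exact formulas, but the three criteria above correspond to standard statistics: a chi-squared statistic over the split-side-by-class table (CHAID), Gini impurity (CART), and entropy-based information gain (ID3). The sketch below is illustrative only; all function and variable names are assumptions, not part of the product.

```python
import math
from collections import Counter

def gini(labels):
    """Gini impurity of a class distribution (0 = pure)."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy of a class distribution, in bits (0 = pure)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def chi_squared(left, right):
    """CHAID-style chi-squared statistic for the 2 x k split-side/class table."""
    totals = Counter(left) + Counter(right)
    n = len(left) + len(right)
    stat = 0.0
    for side in (left, right):
        observed = Counter(side)
        for cls, total in totals.items():
            expected = len(side) * total / n
            stat += (observed[cls] - expected) ** 2 / expected
    return stat

def cart_and_id3_scores(left, right):
    """Weighted child impurity (CART, lower is better) and
    information gain (ID3, higher is better) for one candidate split."""
    n, nl, nr = len(left) + len(right), len(left), len(right)
    impurity = (nl / n) * gini(left) + (nr / n) * gini(right)
    gain = entropy(left + right) - ((nl / n) * entropy(left) + (nr / n) * entropy(right))
    return impurity, gain

# A perfectly separating candidate split: impurity 0.0, gain 1.0, chi-squared 4.0.
impurity, gain = cart_and_id3_scores(["yes", "yes"], ["no", "no"])
stat = chi_squared(["yes", "yes"], ["no", "no"])
```

CHAID then converts the statistic into a significance level (a p-value against the chi-squared distribution) to compare with the Significance is over threshold; that step is omitted here.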
- Select predictors:
- To select the best predictors in a group, click Use best of each group.
- To select all the predictors, click Use all predictors.
- To choose particular predictors, in the Use predictor column, select the check boxes for the predictors you want to use.
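In code terms, the predictor choices above amount to deciding which feature columns feed the model. A minimal sketch with made-up data (the column names and values are hypothetical):

```python
# Hypothetical training rows; each key is a candidate predictor or the target.
rows = [
    {"age": 34, "income": 52_000, "region": "east", "churned": "no"},
    {"age": 51, "income": 38_000, "region": "west", "churned": "yes"},
]

# "Choose particular predictors" = keep only the checked columns.
use_predictors = ["age", "income"]
X = [[row[p] for p in use_predictors] for row in rows]
y = [row["churned"] for row in rows]
```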
- Set the Maximum depth of the node tree and the Minimum leaf size.
- Maximum depth
- The maximum distance from a leaf to the root, measured as the number of ancestors.
- Minimum leaf size
- The minimum size of a leaf as a percentage of the sample.
The greater the depth and the smaller the minimum, the more specific the predictions can be. However, they can also become less reliable.
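The interplay of the two limits can be sketched as a stopping rule: a node may split only while it is shallower than Maximum depth and large enough that both children could still satisfy Minimum leaf size. The function below is an illustrative sketch under those assumptions, not the product's actual rule:

```python
def may_split(node_cases, depth, total_cases, max_depth, min_leaf_frac):
    """True if a node at the given depth may still be split.

    min_leaf_frac is the Minimum leaf size as a fraction of the sample;
    both children must be able to reach at least that many cases.
    """
    min_leaf_cases = min_leaf_frac * total_cases
    return depth < max_depth and node_cases >= 2 * min_leaf_cases

# With a 1000-case sample, depth limit 5, and a 2% minimum leaf size:
may_split(100, 3, 1000, 5, 0.02)   # True: room to grow
may_split(100, 5, 1000, 5, 0.02)   # False: already at the maximum depth
may_split(30, 3, 1000, 5, 0.02)    # False: children would fall below 20 cases
```

Raising max_depth or lowering min_leaf_frac lets more nodes pass this test, which is why those settings produce larger, more specific, but potentially less reliable trees.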
- Confirm the settings by clicking Create.
- In the Create decision tree workspace, click Submit.