Regression tree option for decision tree
Evaluate a Decision Tree using the Regression Tree option, which adds new sampling and visualization features. To enable it, right-click within a Decision Tree visualization and select Options > Regression Tree.
Updated Decision Tree builder: A new algorithm has been introduced for building Decision Trees. It handles more general data and provides a more informative visualization, improving the precision of the prediction.
Improved data sampling module: An updated adaptive sampling scheme helps Decision Tree and Propensity Score produce higher-precision results.
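The adaptive sampling scheme itself is not documented here, but the general idea is to grow a sample only as far as needed for a stable estimate. The Python sketch below is an illustrative assumption rather than the actual algorithm; the function name adaptive_sample and its parameters are hypothetical.

```python
# Illustrative only: this is NOT the product's sampling algorithm.
# It assumes a simple scheme that grows a random sample until an
# estimate of interest (here, the mean) stabilizes.
import numpy as np

def adaptive_sample(values, start=1_000, growth=2.0, tol=0.01, seed=0):
    """Grow a random sample until the sample mean changes by less than tol."""
    rng = np.random.default_rng(seed)
    size = min(start, len(values))
    previous = None
    while True:
        sample = rng.choice(values, size=size, replace=False)
        estimate = sample.mean()
        if previous is not None and abs(estimate - previous) <= tol * abs(previous):
            return sample          # estimate has stabilized; sample is large enough
        if size == len(values):
            return values          # fell back to the full data set
        previous = estimate
        size = min(int(size * growth), len(values))

# Example: 1 million noisy observations
data = np.random.default_rng(1).normal(loc=5.0, scale=2.0, size=1_000_000)
print(len(adaptive_sample(data)))
```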
Green indicates true and red indicates false. Color saturation, such as deep red versus light red, indicates probability. For example, a node with deep red has a very high probability of being false, while a node with light red has a lower probability of being false. A node with deep green has a very high probability of being true.
All Decision Trees use varying branch widths to indicate the amount of traffic flowing through each branch of the tree.
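As a concrete illustration of these two encodings, the Python sketch below maps a node's probability of being true to a green or red color whose saturation grows with certainty, and maps a branch's share of the records to a line width. The specific scales are assumptions for illustration, not the product's actual rendering logic.

```python
# Minimal sketch of the color and width encodings described above.
# The exact scales used by the product are not documented; these
# mappings are illustrative assumptions.
import colorsys

def node_color(p_true):
    """Map P(true) to a green (true) or red (false) hue whose
    saturation grows as the prediction becomes more certain."""
    hue = 1/3 if p_true >= 0.5 else 0.0   # green = 120 degrees, red = 0 degrees
    saturation = abs(p_true - 0.5) * 2    # 0 at 50/50, 1 at full certainty
    return colorsys.hsv_to_rgb(hue, saturation, 1.0)

def branch_width(rows_in_branch, rows_total, max_width=12.0, min_width=1.0):
    """Scale branch width by the share of records flowing through it."""
    share = rows_in_branch / rows_total
    return min_width + share * (max_width - min_width)

print(node_color(0.95))             # deep green: very likely true
print(node_color(0.55))             # light green: slightly more likely true
print(node_color(0.10))             # deep red: very likely false
print(branch_width(2_500, 10_000))  # wider branch carrying 25% of the traffic
```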
In a Decision Tree visualization, right-click and select Options > Regression Tree. When selected, additional settings are provided:
This option controls the complexity of the Regression Tree. Depending on your data, you might need to build a Fine tree (a more complicated structure with more nodes) to get a more meaningful tree classification. If you have a large amount of data, a relatively Coarse tree (less complicated, with fewer nodes) could work well.
Note: Typical is the default setting. There are some extreme cases where the Typical setting doesn't work as well and the Coarse or Fine setting can provide a better view of the data.
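To get a feel for what Coarse, Typical, and Fine mean in practice, the sketch below trains regression trees of increasing complexity with scikit-learn. The preset names and parameter values are assumptions chosen for illustration; they are not the product's actual builder settings.

```python
# Conceptual illustration: Coarse/Typical/Fine behave like limits on how
# far a regression tree is allowed to split. The presets below are
# hypothetical, not the product's implementation.
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

# Hypothetical presets: a Coarse tree stops splitting early (few, large
# leaves), a Fine tree keeps splitting (many, small leaves).
PRESETS = {
    "Coarse":  {"max_depth": 3,  "min_samples_leaf": 100},
    "Typical": {"max_depth": 6,  "min_samples_leaf": 20},
    "Fine":    {"max_depth": 12, "min_samples_leaf": 5},
}

X, y = make_regression(n_samples=5_000, n_features=8, noise=10.0, random_state=0)

for name, params in PRESETS.items():
    tree = DecisionTreeRegressor(random_state=0, **params)
    tree.fit(X, y)
    print(f"{name:8s} leaves={tree.get_n_leaves():4d}  R^2={tree.score(X, y):.3f}")
```

A coarser tree yields fewer leaves and a simpler, easier-to-read structure; a finer tree captures more detail at the cost of complexity and potential overfitting.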