
Info
This article supports Audience Studio - Legacy.

To tune predictive scoring, review the Predictive Scores for your master segment. Depending on your Accuracy and AUC (Area Under the ROC Curve), you might need to adjust:

...

  1. Define predictive features.



  2. Select Save and Train on Predictive Scoring to build the predictive model.



  3. Select Predictive Scores to review the score distributions (an offline sketch of a distribution check follows these steps).

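If you export the predictive scores, you can also look at the distribution outside the product. The following is a minimal sketch, assuming an export file named exported_scores.csv with a predictive_score column per profile; neither name is part of Audience Studio.

    # Illustrative only: plot a score distribution similar to the Predictive
    # Scores view from an assumed export of per-profile scores.
    import pandas as pd
    import matplotlib.pyplot as plt

    scores = pd.read_csv("exported_scores.csv")  # assumed export; not an Audience Studio API

    plt.hist(scores["predictive_score"], bins=50, range=(0.0, 1.0))
    plt.xlabel("Predictive score")
    plt.ylabel("Number of profiles")
    plt.title("Score distribution across the master segment")
    plt.show()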


Interpreting the Accuracy of Prediction

...

The metrics of Accuracy and AUC (Area Under the ROC Curve) can help you determine how to tune your predictive scoring.

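Both metrics can be reproduced offline from known labels and predicted scores. The following sketch uses scikit-learn only to illustrate what the two numbers measure; it is not how Audience Studio computes the dashboard values.

    # Sketch: Accuracy is the fraction of correct yes/no calls at a cutoff,
    # while AUC measures how well the scores rank positives above negatives.
    from sklearn.metrics import accuracy_score, roc_auc_score

    y_true = [1, 0, 1, 1, 0, 0, 1, 0]                    # known positive/negative samples
    y_score = [0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3]   # predicted scores

    auc = roc_auc_score(y_true, y_score)                            # threshold-free
    accuracy = accuracy_score(y_true, [s >= 0.5 for s in y_score])  # at a 0.5 cutoff

    print(f"AUC: {auc:.2f}, Accuracy: {accuracy:.2f}")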


You can interpret the value of AUC as follows:

...

  • Add more predictive features

  • Rethink your problem (for example, the definition of the population, positive samples, and scoring target segments)

  • Revisit your profile set (master segment) definition (for example, the customer data), use more data, and add predictive features

Adding more features requires that you edit the configuration in Predictive Scoring.


Suggest Predictive Features automatically drops some attributes that are likely to be meaningless, but it can misclassify attributes. Therefore, add columns that are likely to be informative to the input boxes, based on your understanding of the data.
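As a rough offline check of whether a candidate column is informative, you can compare the positive rate across its values. The DataFrame and column names below are illustrative assumptions, not Audience Studio data.

    # Sketch: a column whose values separate positives from negatives is a
    # good candidate feature. Data and column names are illustrative.
    import pandas as pd

    df = pd.DataFrame({
        "membership_tier": ["gold", "silver", "gold", "bronze", "silver", "gold"],
        "is_positive":     [1, 0, 1, 0, 0, 1],
    })

    positive_rate = df.groupby("membership_tier")["is_positive"].mean()
    print(positive_rate.sort_values(ascending=False))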

...

Exceptionally high accuracy can result from the choice of predictive features. Check the feature importance on the dashboard and determine whether the absolute values of importance for the top features are unusually large.

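If you export the importance values shown on the dashboard, a simple dominance check can flag a feature whose absolute importance dwarfs the rest. The file and column names below are assumptions for illustration.

    # Sketch: flag a single feature whose absolute importance dominates the
    # others. "feature_importance.csv" and its columns are assumed names.
    import pandas as pd

    importances = pd.read_csv("feature_importance.csv")
    importances["abs_importance"] = importances["importance"].abs()
    ranked = importances.sort_values("abs_importance", ascending=False)

    top, runner_up = ranked["abs_importance"].iloc[0], ranked["abs_importance"].iloc[1]
    if top > 3 * runner_up:  # arbitrary illustrative threshold
        print(f"Suspiciously dominant feature: {ranked['feature'].iloc[0]}")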


To tune predictive scoring when AUC is greater than 0.9

  1. Drop the predictive feature that corresponds to the most important feature in Predictive Scoring.

  2. Select Save and Train on Predictive Scoring to rebuild the predictive model.

  3. Select Predictive Scores to review the score distributions.

  4. Repeat these steps until the AUC is between 0.7 and 0.9 (see the sketch of this loop below).

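The loop in the steps above can be illustrated outside the product with scikit-learn: drop the most important feature and retrain until the AUC leaves the suspicious range. The data, model, and column names are assumptions for illustration only; in Audience Studio itself you perform these steps through the Predictive Scoring screens.

    # Sketch of the drop-and-retrain loop, illustrated with scikit-learn.
    # "training_data.csv", its numeric feature columns, and the "is_positive"
    # label are assumptions; this is not an Audience Studio API.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("training_data.csv")
    features = [c for c in df.columns if c != "is_positive"]

    while features:
        X_train, X_test, y_train, y_test = train_test_split(
            df[features], df["is_positive"], test_size=0.3, random_state=0)
        model = GradientBoostingClassifier().fit(X_train, y_train)
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        if auc <= 0.9:  # stop once AUC is no longer suspiciously high
            print(f"AUC {auc:.2f} with features: {features}")
            break
        # Mirror step 1: drop the feature with the largest importance.
        top_feature = max(zip(model.feature_importances_, features))[1]
        features.remove(top_feature)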



Possibility of Overfitting

Very low or exceptionally high accuracy is sometimes caused by overfitting, also referred to as overtraining. Overfitting occurs when the prediction model memorizes the data points themselves rather than learning patterns, for example, when an attribute such as store# has too many distinct values.



The predictive score for a profile that has a specific value in an attribute such as store# is likely to become large, regardless of the values of the other attributes. Consequently, predictive scores become meaningless across the master segment, and accuracy can become strangely biased.
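A quick way to spot attributes that invite this kind of memorization is to check how many distinct values they have before using them as features. The exported file and the threshold below are illustrative assumptions.

    # Sketch: flag high-cardinality attributes (such as store#) that the model
    # could memorize. File name and threshold are illustrative assumptions.
    import pandas as pd

    df = pd.read_csv("profiles.csv")  # assumed export of profile attributes

    for column in df.columns:
        distinct_ratio = df[column].nunique() / len(df)
        if distinct_ratio > 0.5:      # most values are unique for this attribute
            print(f"{column}: {distinct_ratio:.0%} distinct values; consider dropping it")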

...