Accessing intent analysis model evaluation reports
After you build the model, you can evaluate it by using various accuracy measures, such as F-score, precision, and recall. You can also view the test results for each record.
By using in-depth model analysis, you can determine whether the model that you created
produces the results that you expect and meets your accuracy threshold. By viewing
record-by-record test results, you can fine-tune the training data to make your model more
accurate when you rebuild it.
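As a refresher, precision, recall, and F-score are derived from the counts of true positives, false positives, and false negatives for each intent. The following Python sketch shows the standard formulas; the counts in the usage example are illustrative only and are not taken from any particular report.

```python
def precision_recall_fscore(tp, fp, fn):
    """Compute precision, recall, and F-score (F1) from raw counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    fscore = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
    return precision, recall, fscore

# Illustrative counts for a single intent: 40 true positives,
# 10 false positives, 5 false negatives.
p, r, f = precision_recall_fscore(tp=40, fp=10, fn=5)
print(f"precision={p:.2f} recall={r:.2f} F-score={f:.2f}")
# precision=0.80 recall=0.89 F-score=0.84
```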
- To download the model evaluation report to a local directory, perform the
following actions:
- Click Download report.
- Save the Model Analysis Report archive file to a local directory.
- Unpack the archive file. The archive contains the following .csv files (see the sketch after these procedures for one way to inspect them programmatically):
  - test_MAXENT_id_number – Contains all test records. For each test record, you can view the result that you predicted (manual outcome), the result that the model predicted (machine outcome), and whether these results match.
  - test_MAXENT_SCORE_SHEET_id_number – Contains accuracy measures for each entity in the model, for example, the number of true positives, precision, recall, and F-score.
  - test_DATA_SHEET_id_number – Contains all testing and training records.
- To view the summary results in Prediction Studio:
- Click the Expand icon next to the model name.
- In the Class summary tab, view the number of true positives, precision, recall, and F-score results for each entity type.
- In the Test results tab, for each test record, view the result that you predicted (actual), the result that the model predicted (predicted), and whether these results match.
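If you prefer to inspect the downloaded report outside of Prediction Studio, a short script can surface the test records where the manual and machine outcomes disagree. The following Python sketch is only an illustration: the file names keep the id_number placeholder from the archive, and the column names "Manual outcome" and "Machine outcome" are assumptions that you should check against the header row of your exported files.

```python
import csv

# Placeholder file names; replace id_number with the identifier from your download.
TEST_RECORDS_FILE = "test_MAXENT_id_number.csv"
SCORE_SHEET_FILE = "test_MAXENT_SCORE_SHEET_id_number.csv"

# List the test records where the manual and machine outcomes disagree.
# The column names below are assumptions; adjust them to match your file's header row.
with open(TEST_RECORDS_FILE, newline="", encoding="utf-8") as f:
    mismatches = [row for row in csv.DictReader(f)
                  if row.get("Manual outcome") != row.get("Machine outcome")]

print(f"{len(mismatches)} test records were misclassified")
for row in mismatches[:10]:
    print(row.get("Manual outcome"), "->", row.get("Machine outcome"))

# Print the per-entity accuracy measures from the score sheet.
with open(SCORE_SHEET_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(row)
```

Reviewing the mismatched records first is usually the quickest way to decide which training records to add or relabel before you rebuild the model.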