# Certified-Machine-Learning-Professional — Question 425

**Type:** multiple_choice
**Topics:** topic_1

## Question

Which of the following is a reason for using Jensen-Shannon (JS) distance over a Kolmogorov-Smirnov (KS) test for numeric feature drift detection?

## Correct Answer

_See scenario._

## Explanation

Selected Answer: E
This is a key advantage of Jensen-Shannon distance: it produces a bounded value between 0 and 1, where 0 indicates identical distributions and values near 1 indicate strongly diverging ones. That bounded score can be interpreted directly, without choosing an arbitrary statistical threshold. The KS test, in contrast, produces a test statistic that must be compared against a critical value, which depends on the significance level chosen.
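The contrast above can be sketched with SciPy. This is an illustrative example, not part of the original question: the feature values, bin count, and variable names are all assumptions. Note that `jensenshannon` operates on probability distributions, so a numeric feature must first be binned into a shared histogram, whereas `ks_2samp` works on the raw samples.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, 10_000)  # e.g. training-time feature values
current = rng.normal(0.3, 1.0, 10_000)    # e.g. production values, slightly shifted

# Bin both samples over a shared set of edges so the histograms are comparable.
edges = np.histogram_bin_edges(np.concatenate([reference, current]), bins=50)
p, _ = np.histogram(reference, bins=edges)
q, _ = np.histogram(current, bins=edges)

# With base=2 the JS distance is bounded in [0, 1]; SciPy normalizes the
# histograms to probability vectors internally. No threshold is required
# to read the score.
js = jensenshannon(p, q, base=2)

# The KS test instead yields a statistic and a p-value, which must be
# compared against a chosen significance level (e.g. 0.05) to decide drift.
ks_stat, p_value = ks_2samp(reference, current)
```

Here the JS distance is a small positive number reflecting the modest mean shift, while the KS p-value only becomes a drift decision after a significance level is picked.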

**Reference:** examtopics_top_comment

---
Source: https://hiexam.net/q/databricks/Certified-Machine-Learning-Professional/425  