databricks · Certified-Machine-Learning-Professional · Q425 · multiple_choice · topic_1

Which of the following is a reason for using Jensen-Shannon (JS) distance over a Kolmogorov-Smirnov (KS) test for numeric feature drift detection?
  • A. All of these reasons
  • B. JS is not normalized or smoothed
  • C. None of these reasons
  • D. JS is more robust when working with large datasets
  • E. JS does not require any manual threshold or cutoff determinations
Explanation
Selected Answer: E This is a key advantage of using Jensen-Shannon distance. It produces a normalized value between 0 and 1 that represents the divergence between two distributions: 0 means the distributions are identical, and values closer to 1 indicate greater divergence. This value can be interpreted directly, without setting arbitrary thresholds or cutoffs. In contrast, the KS test requires comparing its test statistic to a critical value (or p-value), which depends on the significance level chosen, so the practitioner must still make a manual cutoff decision.
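To make the bounded-output property concrete, here is a minimal sketch of computing JS distance between a baseline and a (shifted) current sample of a numeric feature. The binning scheme, bin count, and helper names are illustrative assumptions, not part of the exam question or any specific library API:

```python
import math
import random

def js_distance(p, q):
    """Jensen-Shannon distance between two discrete distributions.

    Uses log base 2, so the result is always in [0, 1]:
    0 = identical distributions, 1 = maximal divergence.
    """
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        # KL divergence; terms with a_i == 0 contribute 0 by convention.
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return math.sqrt((kl(p, m) + kl(q, m)) / 2)

def to_histogram(values, edges):
    """Normalize numeric values into a probability histogram over shared bins."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(edges) - 1):
            # Last bin is closed on the right so the max value is counted.
            if edges[i] <= v < edges[i + 1] or (i == len(edges) - 2 and v == edges[-1]):
                counts[i] += 1
                break
    total = sum(counts) or 1
    return [c / total for c in counts]

random.seed(0)
baseline = [random.gauss(0.0, 1.0) for _ in range(5000)]  # training-time feature
current  = [random.gauss(0.5, 1.0) for _ in range(5000)]  # drifted serving data

# Shared bin edges so both histograms are directly comparable.
lo, hi = min(baseline + current), max(baseline + current)
edges = [lo + (hi - lo) * i / 20 for i in range(21)]

d = js_distance(to_histogram(baseline, edges), to_histogram(current, edges))
print(f"JS distance: {d:.3f}")  # always in [0, 1]; larger means more drift
```

Because the output is bounded, the same monitoring dashboard scale works for every numeric feature, which is exactly the interpretability advantage the explanation describes; a KS statistic instead has to be judged against a significance-level-dependent critical value.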

Reference: examtopics_top_comment
