
# Addons - Metrics

## Maintainers

| Submodule | Maintainers | Contact Info |
|-----------|-------------|--------------|
| cohens_kappa | Aakash Nain | [email protected] |
| f_scores | Saishruthi Swaminathan | [email protected] |
| r_square | Saishruthi Swaminathan | [email protected] |
| matthews_correlation_coefficient | I.H. Jhuo | [email protected] |
| multilabel_confusion_matrix | Saishruthi Swaminathan | [email protected] |

## Contents

| Submodule | Metric | Reference |
|-----------|--------|-----------|
| cohens_kappa | CohenKappa | Cohen's Kappa |
| f_scores | F1Score | F1 Score |
| f_scores | FBetaScore | |
| r_square | RSquare | R-Square |
| matthews_correlation_coefficient | MatthewsCorrelationCoefficient | MCC |
| multilabel_confusion_matrix | MultiLabelConfusionMatrix | mcm |

## Contribution Guidelines

### Standard API

To conform to the current API standard, all metrics must:

* Inherit from `tf.keras.metrics.Metric`.
* Register as a Keras global object so they can be serialized properly: `@tf.keras.utils.register_keras_serializable(package='Addons')`.
* Be added to the `py_library` in this sub-package's BUILD file.
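As a rough sketch of the structure the points above require, here is a hypothetical metric (a running mean squared error, not part of Addons) that subclasses `tf.keras.metrics.Metric` and applies the serialization decorator:

```python
import tensorflow as tf

# Hypothetical example metric illustrating the required structure.
@tf.keras.utils.register_keras_serializable(package='Addons')
class MeanSquaredErrorMetric(tf.keras.metrics.Metric):
    def __init__(self, name='mean_squared_error_metric', **kwargs):
        super().__init__(name=name, **kwargs)
        # State variables that accumulate across batches.
        self.total = self.add_weight(name='total', initializer='zeros')
        self.count = self.add_weight(name='count', initializer='zeros')

    def update_state(self, y_true, y_pred, sample_weight=None):
        y_true = tf.cast(y_true, self.dtype)
        y_pred = tf.cast(y_pred, self.dtype)
        self.total.assign_add(tf.reduce_sum(tf.square(y_true - y_pred)))
        self.count.assign_add(tf.cast(tf.size(y_true), self.dtype))

    def result(self):
        # divide_no_nan returns 0 when no samples have been seen.
        return tf.math.divide_no_nan(self.total, self.count)

    def reset_state(self):
        self.total.assign(0.0)
        self.count.assign(0.0)
```

Because the class is registered under the `Addons` package, a model using it can be saved and reloaded without passing `custom_objects`.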

### Testing Requirements

* Simple unittests that demonstrate the metric behaves as expected.
* When applicable, run all unittests with TensorFlow's `@run_in_graph_and_eager_modes` (for a test method) or `@run_all_in_graph_and_eager_modes` (for a `TestCase` subclass) decorator.
* Add a `py_test` to this sub-package's BUILD file.
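A minimal sketch of such a unittest, using `tf.test.TestCase`; here the built-in `tf.keras.metrics.BinaryAccuracy` stands in for the addon metric under test, and the graph/eager decorators mentioned above are omitted since they come from TensorFlow's internal `test_util` module:

```python
import tensorflow as tf

class MetricTest(tf.test.TestCase):
    def test_known_values(self):
        # Check the metric against a hand-computed expected value.
        metric = tf.keras.metrics.BinaryAccuracy()
        # Predictions threshold at 0.5 -> [1, 0, 0, 1]; two of four match.
        metric.update_state([1, 1, 0, 0], [0.9, 0.4, 0.1, 0.6])
        self.assertAllClose(metric.result(), 0.5)
```

A real addon test would import the metric from its submodule and typically cover several input shapes and dtypes.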

### Documentation Requirements

* Update the table of contents in this sub-package's README.