tensorflow/models

official/legacy/detection/modeling/losses.py

Summary

Maintainability: D (about 1 day of estimated remediation)
Test Coverage: (not reported)

File losses.py has 513 lines of code (exceeds 250 allowed). Consider refactoring.

# Copyright 2024 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in official/legacy/detection/modeling/losses.py - About 1 day to fix
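
A hedged sketch of one way to bring the file under the limit: split losses.py into focused submodules and keep the public import surface stable via re-exports. The submodule names and groupings below are illustrative, not taken from the file.

    # Hypothetical package layout (names illustrative):
    #
    #   modeling/losses/
    #     __init__.py   # re-exports public names so existing call sites keep working
    #     focal.py      # focal_loss and related helpers
    #     rpn.py        # RPN score/box losses
    #     frcnn.py      # Fast R-CNN class/box losses
    #     mask.py       # mask losses
    #
    # Example __init__.py re-export (illustrative):
    # from official.legacy.detection.modeling.losses.focal import focal_loss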

    Function focal_loss has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def focal_loss(logits, targets, alpha, gamma, normalizer):
    Severity: Minor
    Found in official/legacy/detection/modeling/losses.py - About 35 mins to fix
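
One hedged way to get focal_loss under the four-argument limit is to bundle the hyperparameters into a small config object. The FocalLossConfig name, its defaults, and the loss body below are illustrative; the report only shows the signature, so this sketch assumes a standard sigmoid focal loss (Lin et al., 2017).

    import dataclasses

    import tensorflow as tf


    @dataclasses.dataclass(frozen=True)
    class FocalLossConfig:
      # Illustrative defaults; not taken from losses.py.
      alpha: float = 0.25
      gamma: float = 1.5
      normalizer: float = 1.0


    def focal_loss(logits, targets, config: FocalLossConfig):
      """Sketch of a standard sigmoid focal loss with grouped hyperparameters."""
      targets = tf.cast(targets, logits.dtype)
      # Per-element cross entropy, then the focal modulating factor.
      ce = tf.nn.sigmoid_cross_entropy_with_logits(labels=targets, logits=logits)
      p = tf.sigmoid(logits)
      p_t = targets * p + (1.0 - targets) * (1.0 - p)
      alpha_t = targets * config.alpha + (1.0 - targets) * (1.0 - config.alpha)
      loss = alpha_t * (1.0 - p_t) ** config.gamma * ce
      return tf.reduce_sum(loss) / config.normalizer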

      Similar blocks of code found in 2 locations. Consider refactoring.

        def __init__(self, params):
          logging.info('FastrcnnBoxLoss huber_loss_delta %s', params.huber_loss_delta)
      # The delta is typically around the mean value of the regression target.
      # For instance, the regression targets of a 512x512 input with 6 anchors
      # on the P2-P6 pyramid are about [0.1, 0.1, 0.2, 0.2].
      Severity: Major
      Found in official/legacy/detection/modeling/losses.py and 1 other location - About 1 hr to fix
      official/legacy/detection/modeling/losses.py on lines 151..157

      Duplicated Code

      Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

      Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code tends both to replicate further and to diverge, leaving bugs behind as the similar implementations drift apart in subtle ways.

      Tuning

This issue and its counterpart below each have a mass of 47.

      We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

      The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

      If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

      See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
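
As a hedged example, raising the Python mass threshold might look like this in .codeclimate.yml (the shape follows codeclimate-duplication's documented config; the value 50 is illustrative):

    plugins:
      duplication:
        enabled: true
        config:
          languages:
            python:
              mass_threshold: 50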

      Similar blocks of code found in 2 locations. Consider refactoring.

        def __init__(self, params):
          logging.info('RpnBoxLoss huber_loss_delta %s', params.huber_loss_delta)
      # The delta is typically around the mean value of the regression target.
      # For instance, the regression targets of a 512x512 input with 6 anchors
      # on the P2-P6 pyramid are about [0.1, 0.1, 0.2, 0.2].
      Severity: Major
      Found in official/legacy/detection/modeling/losses.py and 1 other location - About 1 hr to fix
      official/legacy/detection/modeling/losses.py on lines 386..392
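
A minimal sketch of one way to remove this pair of duplicates: hoist the shared constructor into a small base class. The _HuberBoxLossBase name is hypothetical, and the Huber-loss construction is assumed from the delta comment, since the excerpts above cut off before the constructor body.

    import logging

    import tf_keras


    class _HuberBoxLossBase:
      """Hypothetical shared base for RpnBoxLoss and FastrcnnBoxLoss."""

      def __init__(self, params):
        logging.info('%s huber_loss_delta %s',
                     type(self).__name__, params.huber_loss_delta)
        # The delta is typically around the mean value of the regression target,
        # e.g. roughly [0.1, 0.1, 0.2, 0.2] for a 512x512 input with 6 anchors
        # on the P2-P6 pyramid.
        self._huber_loss = tf_keras.losses.Huber(
            delta=params.huber_loss_delta,
            reduction=tf_keras.losses.Reduction.SUM)


    class RpnBoxLoss(_HuberBoxLossBase):
      pass  # RPN-specific __call__ logic would live here


    class FastrcnnBoxLoss(_HuberBoxLossBase):
      pass  # Fast R-CNN-specific __call__ logic would live here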

      Similar blocks of code found in 3 locations. Consider refactoring.

        def __init__(self):
          self._categorical_crossentropy = tf_keras.losses.CategoricalCrossentropy(
              reduction=tf_keras.losses.Reduction.SUM, from_logits=True)
      Severity: Minor
      Found in official/legacy/detection/modeling/losses.py and 2 other locations - About 30 mins to fix
      official/legacy/detection/modeling/losses.py on lines 507..509
      official/legacy/detection/modeling/losses.py on lines 697..699

This issue and the two similar ones below each have a mass of 32; see the Duplicated Code and Tuning notes above.

      Similar blocks of code found in 3 locations. Consider refactoring.

        def __init__(self):
          self._binary_crossentropy = tf_keras.losses.BinaryCrossentropy(
              reduction=tf_keras.losses.Reduction.SUM, from_logits=True)
      Severity: Minor
      Found in official/legacy/detection/modeling/losses.py and 2 other locations - About 30 mins to fix
      official/legacy/detection/modeling/losses.py on lines 344..346
      official/legacy/detection/modeling/losses.py on lines 507..509

      Similar blocks of code found in 3 locations. Consider refactoring.

        def __init__(self):
          self._binary_crossentropy = tf_keras.losses.BinaryCrossentropy(
              reduction=tf_keras.losses.Reduction.SUM, from_logits=True)
      Severity: Minor
      Found in official/legacy/detection/modeling/losses.py and 2 other locations - About 30 mins to fix
      official/legacy/detection/modeling/losses.py on lines 344..346
      official/legacy/detection/modeling/losses.py on lines 697..699
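
One hedged way to collapse these three near-identical constructors is a tiny factory for sum-reduced, from-logits losses. The helper name is hypothetical, and the class below merely stands in for the three call sites in losses.py.

    import tf_keras


    def _sum_over_logits(loss_cls):
      """Hypothetical helper: builds a from-logits loss with SUM reduction."""
      return loss_cls(reduction=tf_keras.losses.Reduction.SUM, from_logits=True)


    class MaskLoss:
      """Illustrative stand-in for the three duplicated classes."""

      def __init__(self):
        # Works the same way for CategoricalCrossentropy at the other call site.
        self._binary_crossentropy = _sum_over_logits(
            tf_keras.losses.BinaryCrossentropy)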
