tensorflow/models
research/object_detection/core/post_processing.py

Summary

Maintainability: F (estimated remediation effort: 1 wk)

File post_processing.py has 1102 lines of code (exceeds 250 allowed). Consider refactoring.

# Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in research/object_detection/core/post_processing.py - About 2 days to fix

    Function batch_multiclass_non_max_suppression has a Cognitive Complexity of 62 (exceeds 5 allowed). Consider refactoring.

    def batch_multiclass_non_max_suppression(boxes,
                                             scores,
                                             score_thresh,
                                             iou_thresh,
                                             max_size_per_class,
    Severity: Minor
    Found in research/object_detection/core/post_processing.py - About 1 day to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"

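The rules above can be illustrated with a small, hypothetical Python sketch (the function names and the `+N` annotations are illustrative, not part of the analyzed file): shorthand and flat guard clauses keep the score low, while nested flow-breaking structures compound it.

```python
# Hypothetical sketch of how Cognitive Complexity accrues.

def score_flat(values):
    """Linear flow: each flow-break adds +1; nothing is nested."""
    if not values:          # +1
        return 0
    total = 0
    for v in values:        # +1
        total += max(v, 0)  # shorthand (max) adds no complexity
    return total            # cognitive complexity: 2

def score_nested(rows):
    """Nested flow-breaks: each structure also pays for its depth."""
    total = 0
    for row in rows:        # +1
        for v in row:       # +2 (nested one level deep)
            if v > 0:       # +3 (nested two levels deep)
                total += v
    return total            # cognitive complexity: 6
```

Both functions sum the positive values; the nested variant is no more powerful, only harder to read, which is exactly what the metric penalizes.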

    Function multiclass_non_max_suppression has a Cognitive Complexity of 34 (exceeds 5 allowed). Consider refactoring.

    def multiclass_non_max_suppression(boxes,
                                       scores,
                                       score_thresh,
                                       iou_thresh,
                                       max_size_per_class,
    Severity: Minor
    Found in research/object_detection/core/post_processing.py - About 5 hrs to fix


    Function class_agnostic_non_max_suppression has a Cognitive Complexity of 30 (exceeds 5 allowed). Consider refactoring.

    def class_agnostic_non_max_suppression(boxes,
                                           scores,
                                           score_thresh,
                                           iou_thresh,
                                           max_classes_per_detection=1,
    Severity: Minor
    Found in research/object_detection/core/post_processing.py - About 4 hrs to fix


    Function batch_multiclass_non_max_suppression has 22 arguments (exceeds 4 allowed). Consider refactoring.

    def batch_multiclass_non_max_suppression(boxes,
    Severity: Major
    Found in research/object_detection/core/post_processing.py - About 2 hrs to fix
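A common refactoring for a long argument list like this is to group related parameters into a single parameter object, so thresholds travel through the call chain as one value. A hypothetical sketch (names such as `NMSConfig` and `run_nms` are illustrative, not the actual API):

```python
from dataclasses import dataclass

# Hypothetical parameter object bundling the suppression settings that
# would otherwise be passed as many separate positional arguments.
@dataclass(frozen=True)
class NMSConfig:
    score_thresh: float
    iou_thresh: float
    max_size_per_class: int
    max_total_size: int = 0

def run_nms(boxes, scores, config: NMSConfig):
    """Callers pass one config object instead of 20+ loose arguments."""
    # The real suppression logic would go here; this stub just shows
    # how the settings are consumed from the config.
    return (config.score_thresh, config.iou_thresh)

cfg = NMSConfig(score_thresh=0.5, iou_thresh=0.6, max_size_per_class=100)
```

Freezing the dataclass also makes the settings hashable and safe to share across batch elements.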

      Function multiclass_non_max_suppression has 17 arguments (exceeds 4 allowed). Consider refactoring.

      def multiclass_non_max_suppression(boxes,
      Severity: Major
      Found in research/object_detection/core/post_processing.py - About 2 hrs to fix

        Function class_agnostic_non_max_suppression has 15 arguments (exceeds 4 allowed). Consider refactoring.

        def class_agnostic_non_max_suppression(boxes,
        Severity: Major
        Found in research/object_detection/core/post_processing.py - About 1 hr to fix

          Function _validate_boxes_scores_iou_thresh has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

          def _validate_boxes_scores_iou_thresh(boxes, scores, iou_thresh,
                                                change_coordinate_frame, clip_window):
            """Validates boxes, scores and iou_thresh.
          
            This function validates the boxes, scores, iou_thresh
          Severity: Minor
          Found in research/object_detection/core/post_processing.py - About 55 mins to fix


          Avoid deeply nested control flow statements.

                    if soft_nms_sigma != 0:
                      raise ValueError('Soft NMS not supported in current TF version!')
                    selected_indices = tf.image.non_max_suppression(
          Severity: Major
          Found in research/object_detection/core/post_processing.py - About 45 mins to fix
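Deep nesting like this can often be flattened with an early exit: handle the unsupported case up front so the happy path stays at one indent level. A hypothetical, pure-Python sketch of the guard-clause pattern (standing in for the TF code above, not taken from it):

```python
def select_indices(soft_nms_sigma, scores):
    """Guard clause: reject the unsupported configuration immediately
    instead of burying the raise several `if` levels deep."""
    if soft_nms_sigma != 0:
        raise ValueError('Soft NMS not supported in current TF version!')
    # Error case handled; the remaining logic needs no extra nesting.
    return [i for i, score in enumerate(scores) if score > 0]
```

Each guard removed this way takes one level of indentation off every statement that follows it.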

            Function _validate_boxes_scores_iou_thresh has 5 arguments (exceeds 4 allowed). Consider refactoring.

            def _validate_boxes_scores_iou_thresh(boxes, scores, iou_thresh,
            Severity: Minor
            Found in research/object_detection/core/post_processing.py - About 35 mins to fix

              Function partitioned_non_max_suppression_padded has 5 arguments (exceeds 4 allowed). Consider refactoring.

              def partitioned_non_max_suppression_padded(boxes,
              Severity: Minor
              Found in research/object_detection/core/post_processing.py - About 35 mins to fix

                Identical blocks of code found in 2 locations. Consider refactoring.

                        if (hasattr(tf.image, 'non_max_suppression_with_scores') and
                            tf.compat.forward_compatible(2019, 6, 6) and not use_hard_nms):
                          (selected_indices, selected_scores
                          ) = tf.image.non_max_suppression_with_scores(
                              boxlist_and_class_scores.get(),
                Severity: Major
                Found in research/object_detection/core/post_processing.py and 1 other location - About 1 day to fix
                research/object_detection/core/post_processing.py on lines 808..827

                Duplicated Code

                Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

                Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

                When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

                Tuning

                This issue has a mass of 132.

                We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

                The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

                If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

                See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

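Resolving a duplication finding usually means extracting the shared block into a single helper that every former copy calls. A hypothetical, simplified sketch of that shape for the duplicated NMS dispatch above (the helper name and callables are illustrative):

```python
# Hypothetical helper: both duplicated call sites delegate here, so the
# capability check and the dispatch live in exactly one place.
def nms_with_fallback(run_with_scores, run_plain, use_hard_nms=False):
    """Prefer the soft-NMS path when it is available and allowed;
    otherwise fall back to plain NMS."""
    if run_with_scores is not None and not use_hard_nms:
        return run_with_scores()
    return run_plain()

# Each former duplicate shrinks to a one-line call:
result = nms_with_fallback(
    run_with_scores=lambda: ('with_scores', [0, 2]),
    run_plain=lambda: ('plain', [0]),
)
```

With one authoritative copy, a future fix to the dispatch logic cannot diverge between the two call sites.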

                Identical blocks of code found in 2 locations. Consider refactoring.

                      if (hasattr(tf.image, 'non_max_suppression_with_scores') and
                          tf.compat.forward_compatible(2019, 6, 6)):
                        (selected_indices, selected_scores
                        ) = tf.image.non_max_suppression_with_scores(
                            boxlist_and_class_scores.get(),
                Severity: Major
                Found in research/object_detection/core/post_processing.py and 1 other location - About 1 day to fix
                research/object_detection/core/post_processing.py on lines 580..599

                This issue has a mass of 132.

                Similar blocks of code found in 2 locations. Consider refactoring.

                    if max_total_size:
                      max_total_size = tf.minimum(max_total_size, sorted_boxes.num_boxes())
                      sorted_boxes = box_list_ops.gather(sorted_boxes, tf.range(max_total_size))
                      num_valid_nms_boxes_cumulative = tf.where(
                          max_total_size > num_valid_nms_boxes_cumulative,
                Severity: Major
                Found in research/object_detection/core/post_processing.py and 1 other location - About 2 hrs to fix
                research/object_detection/core/post_processing.py on lines 865..869

                This issue has a mass of 57.

                Similar blocks of code found in 2 locations. Consider refactoring.

                    if max_total_size:
                      max_total_size = tf.minimum(max_total_size, sorted_boxes.num_boxes())
                      sorted_boxes = box_list_ops.gather(sorted_boxes, tf.range(max_total_size))
                      num_valid_nms_boxes = tf.where(max_total_size > num_valid_nms_boxes,
                                                     num_valid_nms_boxes, max_total_size)
                Severity: Major
                Found in research/object_detection/core/post_processing.py and 1 other location - About 2 hrs to fix
                research/object_detection/core/post_processing.py on lines 640..645

                This issue has a mass of 57.

                Similar blocks of code found in 2 locations. Consider refactoring.

                        if use_partitioned_nms:
                          (selected_indices, num_valid_nms_boxes,
                           boxlist_and_class_scores.data['boxes'],
                           boxlist_and_class_scores.data['scores'],
                           _) = partitioned_non_max_suppression_padded(
                Severity: Major
                Found in research/object_detection/core/post_processing.py and 1 other location - About 2 hrs to fix
                research/object_detection/core/post_processing.py on lines 782..788

                This issue has a mass of 54.

                Similar blocks of code found in 2 locations. Consider refactoring.

                        (selected_indices, num_valid_nms_boxes,
                         boxlist_and_class_scores.data['boxes'],
                         boxlist_and_class_scores.data['scores'],
                         argsort_ids) = partitioned_non_max_suppression_padded(
                             boxlist_and_class_scores.get(),
                Severity: Major
                Found in research/object_detection/core/post_processing.py and 1 other location - About 2 hrs to fix
                research/object_detection/core/post_processing.py on lines 554..561

                This issue has a mass of 54.

                Similar blocks of code found in 2 locations. Consider refactoring.

                      if boundaries is not None:
                        per_class_boundaries = per_class_boundaries_list[boxes_idx]
                        boxlist_and_class_scores.add_field(fields.BoxListFields.boundaries,
                                                           per_class_boundaries)
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 45 mins to fix
                research/object_detection/core/post_processing.py on lines 538..541

                This issue has a mass of 35.

                Similar blocks of code found in 2 locations. Consider refactoring.

                  if additional_fields is None:
                    ordered_additional_fields = collections.OrderedDict()
                  else:
                    ordered_additional_fields = collections.OrderedDict(
                        sorted(additional_fields.items(), key=lambda item: item[0]))
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 45 mins to fix
                research/efficient-hrl/agents/circular_buffer.py on lines 123..125

                This issue has a mass of 35.

                Similar blocks of code found in 2 locations. Consider refactoring.

                      if masks is not None:
                        per_class_masks = per_class_masks_list[boxes_idx]
                        boxlist_and_class_scores.add_field(fields.BoxListFields.masks,
                                                           per_class_masks)
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 45 mins to fix
                research/object_detection/core/post_processing.py on lines 542..545

                This issue has a mass of 35.

                Identical blocks of code found in 2 locations. Consider refactoring.

                    nms_result.add_field(
                        fields.BoxListFields.scores,
                        tf.where(valid_nms_boxes_indices,
                                 selected_scores, -1*tf.ones(max_selection_size)))
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 40 mins to fix
                research/object_detection/core/post_processing.py on lines 620..623

                This issue has a mass of 34.

                Identical blocks of code found in 2 locations. Consider refactoring.

                      nms_result.add_field(
                          fields.BoxListFields.scores,
                          tf.where(valid_nms_boxes_indices,
                                   selected_scores, -1*tf.ones(max_selection_size)))
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 40 mins to fix
                research/object_detection/core/post_processing.py on lines 846..849

                This issue has a mass of 34.

                Similar blocks of code found in 2 locations. Consider refactoring.

                    union_area = area1 + tf.transpose(area2,
                                                      [0, 2, 1]) - intersection_area + 1e-8
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 40 mins to fix
                official/legacy/detection/utils/box_utils.py on lines 677..677

                This issue has a mass of 34.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                    if additional_fields is not None:
                      for key, tensor in additional_fields.items():
                        boxlist_and_class_scores.add_field(key, tensor)
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 35 mins to fix
                research/object_detection/core/post_processing.py on lines 546..548

                This issue has a mass of 33.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                      if additional_fields is not None:
                        for key, tensor in additional_fields.items():
                          boxlist_and_class_scores.add_field(key, tensor)
                Severity: Minor
                Found in research/object_detection/core/post_processing.py and 1 other location - About 35 mins to fix
                research/object_detection/core/post_processing.py on lines 773..775

                This issue has a mass of 33.
