tensorflow/models

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py

Summary

Maintainability: F (estimated remediation effort: 2 wks)
Test Coverage: —

File faster_rcnn_meta_arch.py has 2599 lines of code (exceeds 250 allowed). Consider refactoring.

# Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at

FasterRCNNMetaArch has 48 functions (exceeds 20 allowed). Consider refactoring.

    class FasterRCNNMetaArch(model.DetectionModel):
      """Faster R-CNN Meta-architecture definition."""

      def __init__(self,
                   is_training,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 6 hrs to fix

Function __init__ has 41 arguments (exceeds 4 allowed). Consider refactoring.

    def __init__(self,

Severity: Major
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 5 hrs to fix
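A constructor this wide usually becomes easier to manage when related parameters are grouped into small configuration objects that are passed as single arguments. The sketch below only illustrates that idea and is not the project's actual API: the FirstStageConfig fields and the trimmed constructor signature are hypothetical and do not map one-to-one onto the real 41 parameters.

    import collections

    # Hypothetical grouping of related constructor arguments. The real
    # FasterRCNNMetaArch.__init__ accepts 41 separate parameters; a config
    # object like this would let callers pass one value per stage instead.
    FirstStageConfig = collections.namedtuple('FirstStageConfig', [
        'anchor_generator',
        'nms_score_threshold',
        'nms_iou_threshold',
        'max_proposals',
        'localization_loss_weight',
        'objectness_loss_weight',
    ])


    class FasterRCNNMetaArchSketch(object):
      """Sketch of a constructor that accepts grouped configuration objects."""

      def __init__(self, is_training, num_classes, image_resizer_fn,
                   feature_extractor, first_stage_config):
        self._is_training = is_training
        self._num_classes = num_classes
        self._image_resizer_fn = image_resizer_fn
        self._feature_extractor = feature_extractor
        # Unpack grouped settings instead of accepting each one as a parameter.
        self._first_stage_max_proposals = first_stage_config.max_proposals
        self._first_stage_nms_iou_threshold = first_stage_config.nms_iou_threshold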

Function postprocess has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.

    def postprocess(self, prediction_dict, true_image_shapes):
      """Convert prediction tensors to final detections.

      This function converts raw predictions tensors to final detection results.
      See base class for output format conventions.  Note also that by default,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 2 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
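For illustration only (this snippet is not taken from faster_rcnn_meta_arch.py), the two functions below return the same result, but the first nests its flow-breaking structures while the second keeps the flow flat, which is what drives the difference in their cognitive complexity scores.

    # Nested version: each flow-breaking structure adds 1, plus 1 per level
    # of nesting it sits inside, for a total cognitive complexity of 6.
    def first_even_square_nested(values):
      result = None
      for value in values:        # +1
        if value % 2 == 0:        # +1, plus +1 for nesting
          if result is None:      # +1, plus +2 for nesting
            result = value * value
      return result


    # Flatter version of the same logic: total cognitive complexity of 3.
    def first_even_square_flat(values):
      for value in values:        # +1
        if value % 2 == 0:        # +1, plus +1 for nesting
          return value * value
      return None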

Function __init__ has 48 lines of code (exceeds 25 allowed). Consider refactoring.

    def __init__(self,
                 is_training,
                 num_classes,
                 image_resizer_fn,
                 feature_extractor,

Function _loss_box_classifier has a Cognitive Complexity of 14 (exceeds 5 allowed). Consider refactoring.

    def _loss_box_classifier(self,
                             refined_box_encodings,
                             class_predictions_with_background,
                             proposal_boxes,
                             num_proposals,


Function _loss_box_classifier has 12 arguments (exceeds 4 allowed). Consider refactoring.

    def _loss_box_classifier(self,

Function _format_groundtruth_data has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.

    def _format_groundtruth_data(self, image_shapes):
      """Helper function for preparing groundtruth data for target assignment.

      In order to be consistent with the model.DetectionModel interface,
      groundtruth boxes are specified in normalized coordinates and classes are


Function updates has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.

    def updates(self):
      """Returns a list of update operators for this model.

      Returns a list of update operators for this model that must be executed at
      each training step. The estimator's train op needs to have a control


Function regularization_losses has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.

    def regularization_losses(self):
      """Returns a list of regularization losses for this model.

      Returns a list of regularization losses for this model that the estimator
      needs to use during training/optimization.


Function _predict_third_stage has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.

    def _predict_third_stage(self, prediction_dict, image_shapes):
      """Predicts non-box, non-class outputs using refined detections.

      For training, masks as predicted directly on the box_classifier_features,
      which are region-features from the initial anchor boxes.


Function _predict_second_stage has 7 arguments (exceeds 4 allowed). Consider refactoring.

    def _predict_second_stage(self, rpn_box_encodings,

Severity: Major
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 50 mins to fix

Function _postprocess_box_classifier has 6 arguments (exceeds 4 allowed). Consider refactoring.

    def _postprocess_box_classifier(self,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 45 mins to fix

Function __init__ has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def __init__(self,
                 is_training,
                 num_classes,
                 image_resizer_fn,
                 feature_extractor,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 45 mins to fix


Function _sample_box_classifier_batch has 6 arguments (exceeds 4 allowed). Consider refactoring.

    def _sample_box_classifier_batch(

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 45 mins to fix

Function _get_mask_proposal_boxes_and_classes has 6 arguments (exceeds 4 allowed). Consider refactoring.

    def _get_mask_proposal_boxes_and_classes(

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 45 mins to fix

Function _loss_rpn has 6 arguments (exceeds 4 allowed). Consider refactoring.

    def _loss_rpn(self, rpn_box_encodings,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 45 mins to fix

Function _box_prediction has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def _box_prediction(self, rpn_features_to_crop, proposal_boxes_normalized,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 35 mins to fix

Function _postprocess_rpn has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def _postprocess_rpn(self,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 35 mins to fix

Function _proposal_postprocess has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def _proposal_postprocess(self, rpn_box_encodings,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 35 mins to fix

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def __init__(self,

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 35 mins to fix

Function _sample_box_classifier_minibatch_single_image has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def _sample_box_classifier_minibatch_single_image(

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 35 mins to fix

Function restore_from_classification_checkpoint_fn has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def restore_from_classification_checkpoint_fn(
        self,
        first_stage_feature_extractor_scope,
        second_stage_feature_extractor_scope):
      """Returns a map of variables to load from a foreign checkpoint.

Severity: Minor
Found in research/object_detection/meta_architectures/faster_rcnn_meta_arch.py - About 25 mins to fix


Similar blocks of code found in 2 locations. Consider refactoring.

    (batch_cls_targets_with_background, _, _, _,
     _) = target_assigner.batch_assign_targets(
         target_assigner=self._detector_target_assigner,
         anchors_batch=proposal_boxlists,
         gt_box_batch=groundtruth_boxlists,

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2464..2471

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 66.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Similar blocks of code found in 2 locations. Consider refactoring.

    (batch_cls_targets_with_background, batch_cls_weights, batch_reg_targets,
     batch_reg_weights, _) = target_assigner.batch_assign_targets(
         target_assigner=self._detector_target_assigner,
         anchors_batch=proposal_boxlists,
         gt_box_batch=groundtruth_boxlists,

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2676..2683

This issue has a mass of 66.
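One way to remove this pair of duplicates would be a small private helper that performs the detector target assignment once and lets each call site discard the outputs it does not need. The sketch below is written against the argument names visible in the two excerpts above; the final keyword argument (gt_class_targets_batch) is not shown in the excerpts and is an assumption about what follows, so treat the signature as illustrative rather than confirmed.

    def _assign_detector_targets(self, proposal_boxlists, groundtruth_boxlists,
                                 groundtruth_classes_with_background_list):
      """Hypothetical helper wrapping the duplicated batch_assign_targets call.

      Returns the full assignment tuple; callers keep only what they need, e.g.
      `cls_targets, _, _, _, _ = self._assign_detector_targets(...)`.
      """
      return target_assigner.batch_assign_targets(
          target_assigner=self._detector_target_assigner,
          anchors_batch=proposal_boxlists,
          gt_box_batch=groundtruth_boxlists,
          gt_class_targets_batch=groundtruth_classes_with_background_list)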

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                      if self._output_final_box_rpn_features:
                                        if 'rpn_features_to_crop' not in prediction_dict:
                                          raise ValueError(
                                              'Please make sure rpn_features_to_crop is in the prediction_dict.'
                                          )
                                research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1550..1561

This issue has a mass of 64.

Similar blocks of code found in 2 locations. Consider refactoring.

    if self._output_final_box_features:
      if 'rpn_features_to_crop' not in prediction_dict:
        raise ValueError(
            'Please make sure rpn_features_to_crop is in the prediction_dict.'
        )

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1562..1571

This issue has a mass of 64.
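Since both excerpts guard on the same missing key, a tiny validation helper would remove the repetition; the name _require_prediction_field below is hypothetical, not an existing method.

    def _require_prediction_field(self, prediction_dict, field_name):
      """Raises ValueError if `field_name` is missing from `prediction_dict`."""
      if field_name not in prediction_dict:
        raise ValueError(
            'Please make sure {} is in the prediction_dict.'.format(field_name))

Each call site would then shrink to a single line, e.g. self._require_prediction_field(prediction_dict, 'rpn_features_to_crop') inside the respective _output_final_box_features / _output_final_box_rpn_features branches.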

Similar blocks of code found in 2 locations. Consider refactoring.

    if self._feature_extractor_for_box_classifier_features:
      if (self._feature_extractor_for_box_classifier_features !=
          _UNINITIALIZED_FEATURE_EXTRACTOR):
        update_ops.extend(
            self._feature_extractor_for_box_classifier_features.get_updates_for(

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2928..2935

This issue has a mass of 56.

Similar blocks of code found in 2 locations. Consider refactoring.

    if self._feature_extractor_for_proposal_features:
      if (self._feature_extractor_for_proposal_features !=
          _UNINITIALIZED_FEATURE_EXTRACTOR):
        update_ops.extend(
            self._feature_extractor_for_proposal_features.get_updates_for(None))

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2950..2958

This issue has a mass of 56.
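Both branches collect Keras update ops from an optional feature extractor, so the check could live in one helper. The sketch assumes the surrounding code's convention, visible in the excerpts, that an unbuilt extractor is represented by the _UNINITIALIZED_FEATURE_EXTRACTOR sentinel; the helper name and the inputs parameter are hypothetical.

    def _extend_with_extractor_updates(self, feature_extractor, update_ops,
                                       inputs=None):
      """Appends `feature_extractor.get_updates_for(inputs)` if it is usable."""
      if (feature_extractor and
          feature_extractor != _UNINITIALIZED_FEATURE_EXTRACTOR):
        update_ops.extend(feature_extractor.get_updates_for(inputs))

The two sites above would then reduce to calls such as self._extend_with_extractor_updates(self._feature_extractor_for_proposal_features, update_ops) and the analogous call for the box-classifier extractor.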

Similar blocks of code found in 4 locations. Consider refactoring.

    if self._mask_rcnn_box_predictor.is_keras_model:
      mask_predictions = self._mask_rcnn_box_predictor(
          [curr_box_classifier_features],
          prediction_stage=3)
    else:

research/object_detection/meta_architectures/context_rcnn_meta_arch.py on lines 503..511
research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1070..1078
research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1212..1220

This issue has a mass of 55.

Similar blocks of code found in 4 locations. Consider refactoring.

    if self._mask_rcnn_box_predictor.is_keras_model:
      box_predictions = self._mask_rcnn_box_predictor(
          [box_classifier_features],
          prediction_stage=2)
    else:

research/object_detection/meta_architectures/context_rcnn_meta_arch.py on lines 503..511
research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1212..1220
research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1246..1254

This issue has a mass of 55.

Similar blocks of code found in 4 locations. Consider refactoring.

    if self._mask_rcnn_box_predictor.is_keras_model:
      mask_predictions = self._mask_rcnn_box_predictor(
          [curr_box_classifier_features],
          prediction_stage=3)
    else:

research/object_detection/meta_architectures/context_rcnn_meta_arch.py on lines 503..511
research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1070..1078
research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1246..1254

This issue has a mass of 55.
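The keras/non-keras branching around self._mask_rcnn_box_predictor is repeated at each prediction stage, so it could be centralized in one dispatch helper. In the sketch below the Keras branch mirrors the excerpts, but the non-Keras predict(...) call is an assumption about what the elided else: branches contain, and the helper name is hypothetical.

    def _run_mask_rcnn_box_predictor(self, features, prediction_stage):
      """Dispatches to the Keras or slim variant of the mask R-CNN predictor."""
      if self._mask_rcnn_box_predictor.is_keras_model:
        return self._mask_rcnn_box_predictor(
            [features], prediction_stage=prediction_stage)
      # Assumed shape of the non-Keras call; the actual arguments in the
      # elided `else:` branches are not shown in the excerpts above.
      return self._mask_rcnn_box_predictor.predict(
          [features],
          num_predictions_per_location=[1],
          scope=self.second_stage_box_predictor_scope,
          prediction_stage=prediction_stage)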

Identical blocks of code found in 2 locations. Consider refactoring.

    @property
    def anchors(self):
      if not self._anchors:
        raise RuntimeError('anchors have not been constructed yet!')
      if not isinstance(self._anchors, box_list.BoxList):

research/object_detection/meta_architectures/ssd_meta_arch.py on lines 445..451

This issue has a mass of 54.

Similar blocks of code found in 2 locations. Consider refactoring.

    reshaped_detection_features = tf.reshape(
        detection_features_unpooled,
        [batch_size, max_detections,
         tf.shape(detection_features_unpooled)[1],
         tf.shape(detection_features_unpooled)[2],

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1667..1672

This issue has a mass of 54.

Similar blocks of code found in 2 locations. Consider refactoring.

    reshaped_detection_features = tf.reshape(
        flattened_detected_feature_maps,
        [batch_size, max_detections,
         tf.shape(flattened_detected_feature_maps)[1],
         tf.shape(flattened_detected_feature_maps)[2],

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1629..1634

This issue has a mass of 54.
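The two reshape sites differ only in which tensor they operate on, so a small helper parameterized by that tensor would remove the duplication. The function name below is hypothetical, and because the excerpts are cut off after the second spatial dimension, the trailing tf.shape(features)[3] is an assumption about the final (depth) dimension.

    import tensorflow as tf

    def _reshape_per_detection_features(features, batch_size, max_detections):
      """Reshapes flattened per-detection feature maps to
      [batch, max_detections, height, width, depth]."""
      return tf.reshape(
          features,
          [batch_size, max_detections,
           tf.shape(features)[1],
           tf.shape(features)[2],
           tf.shape(features)[3]])  # assumed final dimension (depth)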

Similar blocks of code found in 2 locations. Consider refactoring.

    for scope_name in [first_stage_feature_extractor_scope,
                       second_stage_feature_extractor_scope]:
      if variable.op.name.startswith(scope_name):
        var_name = variable.op.name.replace(scope_name + '/', '')
        variables_to_restore[var_name] = variable

research/object_detection/models/faster_rcnn_inception_resnet_v2_feature_extractor.py on lines 199..203

This issue has a mass of 47.
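The scope-stripping loop is shared with the Inception ResNet v2 feature extractor, so it could live in a small shared utility. The sketch below simply wraps the excerpt's logic; it assumes TF1-style variables exposing an op.name attribute, as in the excerpt, and the function name is hypothetical.

    def _strip_scopes_to_restore_map(variables, scope_names):
      """Maps checkpoint-style names (scope prefix removed) to variables."""
      variables_to_restore = {}
      for variable in variables:
        for scope_name in scope_names:
          if variable.op.name.startswith(scope_name):
            var_name = variable.op.name.replace(scope_name + '/', '')
            variables_to_restore[var_name] = variable
      return variables_to_restore

Both call sites could then share a call such as _strip_scopes_to_restore_map(variables, [first_stage_feature_extractor_scope, second_stage_feature_extractor_scope]).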

Similar blocks of code found in 5 locations. Consider refactoring.

    if self._mask_rcnn_box_predictor:
      if self._mask_rcnn_box_predictor.is_keras_model:
        update_ops.extend(
            self._mask_rcnn_box_predictor.get_updates_for(None))
        update_ops.extend(

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2944..2949
research/object_detection/meta_architectures/rfcn_meta_arch.py on lines 388..393
research/object_detection/meta_architectures/ssd_meta_arch.py on lines 1360..1363
research/object_detection/meta_architectures/ssd_meta_arch.py on lines 1364..1367

This issue has a mass of 46.

Similar blocks of code found in 5 locations. Consider refactoring.

    if self._first_stage_box_predictor.is_keras_model:
      update_ops.extend(
          self._first_stage_box_predictor.get_updates_for(None))
      update_ops.extend(
          self._first_stage_box_predictor.get_updates_for(

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2959..2965
research/object_detection/meta_architectures/rfcn_meta_arch.py on lines 388..393
research/object_detection/meta_architectures/ssd_meta_arch.py on lines 1360..1363
research/object_detection/meta_architectures/ssd_meta_arch.py on lines 1364..1367

This issue has a mass of 46.
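The Keras-update collection around the box predictors repeats the same get_updates_for pattern across the meta-architectures listed above, so a shared helper could absorb it. The sketch is illustrative: the excerpts cut off before the second get_updates_for(...) argument, so it is left here as a generic inputs parameter, and the helper name is hypothetical.

    def _collect_keras_updates(predictor, inputs, update_ops):
      """Appends a Keras predictor's unconditional and input-conditional updates."""
      if predictor and predictor.is_keras_model:
        update_ops.extend(predictor.get_updates_for(None))
        update_ops.extend(predictor.get_updates_for(inputs))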

Identical blocks of code found in 2 locations. Consider refactoring.

    prediction_dict = {
        'refined_box_encodings': tf.cast(refined_box_encodings,
                                         dtype=tf.float32),
        'class_predictions_with_background':
        tf.cast(class_predictions_with_background, dtype=tf.float32),

research/object_detection/meta_architectures/context_rcnn_meta_arch.py on lines 524..532

This issue has a mass of 43.

Similar blocks of code found in 2 locations. Consider refactoring.

    normalized_boxes_per_image = box_list_ops.to_normalized_coordinates(
        box_list.BoxList(proposal_boxes_per_image), image_shape[0],
        image_shape[1], check_range=False).get()

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2214..2217

This issue has a mass of 39.

Similar blocks of code found in 2 locations. Consider refactoring.

    normalized_boxes_per_image = box_list_ops.to_normalized_coordinates(
        box_list.BoxList(boxes_per_image),
        image_shape[0],
        image_shape[1],

research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 1764..1766

                                This issue has a mass of 39. (See the Duplicated Code and Tuning notes above.)
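
                                One way to honor DRY for the snippet above (a minimal sketch only, assuming the two call sites differ just in the image-shape values they pass; the helper name is made up) is a small module-level helper:

                                    from object_detection.core import box_list
                                    from object_detection.utils import box_list_ops


                                    def _to_normalized_boxes(boxes_per_image, height, width, check_range=True):
                                      # Hypothetical helper: wraps the absolute [N, 4] boxes in a BoxList and
                                      # returns the normalized coordinates as a plain tensor.
                                      return box_list_ops.to_normalized_coordinates(
                                          box_list.BoxList(boxes_per_image), height, width,
                                          check_range=check_range).get()

                                Each of the two reported locations could then collapse to a call such as _to_normalized_boxes(boxes_per_image, image_shape[0], image_shape[1]).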

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                    if self._feature_extractor_for_box_classifier_features:
                                      box_classifier_features = (
                                          self._feature_extractor_for_box_classifier_features(
                                              flattened_feature_maps))
                                    else:
                                research/object_detection/meta_architectures/context_rcnn_meta_arch.py on lines 602..610

                                This issue has a mass of 37. (See the Duplicated Code and Tuning notes above.)
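
                                A hedged way to remove this duplication (a sketch only; the else branch is truncated in the snippet, so the fallback callable below is an assumption) is to push the conditional into one shared helper:

                                    def _apply_box_classifier_extractor(keras_extractor, fallback_fn,
                                                                        flattened_feature_maps):
                                      # Hypothetical helper: prefer the configured Keras-based extractor
                                      # (e.g. self._feature_extractor_for_box_classifier_features) when it is
                                      # set, otherwise call fallback_fn, which stands in for whatever the
                                      # elided else-branch does.
                                      if keras_extractor:
                                        return keras_extractor(flattened_feature_maps)
                                      return fallback_fn(flattened_feature_maps)

                                Both this file and context_rcnn_meta_arch.py could then share the helper instead of repeating the branch.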

                                Identical blocks of code found in 3 locations. Consider refactoring.
                                Open

                                      if self.groundtruth_has_field(fields.InputDataFields.is_annotated):
                                        losses_mask = tf.stack(self.groundtruth_lists(
                                            fields.InputDataFields.is_annotated))
                                research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2510..2512
                                research/object_detection/meta_architectures/ssd_meta_arch.py on lines 871..873

                                This issue has a mass of 35. (See the Duplicated Code and Tuning notes above.)
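
                                Because the same three lines appear in three loss functions, one option (a sketch with a made-up name) is a tiny utility that both meta-architectures could import:

                                    import tensorflow.compat.v1 as tf

                                    from object_detection.core import standard_fields as fields


                                    def _optional_losses_mask(model):
                                      # Hypothetical helper: returns a stacked per-image is_annotated mask,
                                      # or None when the groundtruth does not carry that field.
                                      if model.groundtruth_has_field(fields.InputDataFields.is_annotated):
                                        return tf.stack(
                                            model.groundtruth_lists(fields.InputDataFields.is_annotated))
                                      return None

                                Each call site would then reduce to losses_mask = _optional_losses_mask(self); the same sketch also covers the identical occurrence reported below.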

                                Identical blocks of code found in 3 locations. Consider refactoring.
                                Open

                                      if self.groundtruth_has_field(fields.InputDataFields.is_annotated):
                                        losses_mask = tf.stack(self.groundtruth_lists(
                                            fields.InputDataFields.is_annotated))
                                research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2354..2356
                                research/object_detection/meta_architectures/ssd_meta_arch.py on lines 871..873

                                This issue has a mass of 35. (See the Duplicated Code and Tuning notes above.)

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                    proposal_boxes = tf.reshape(box_list_ops.to_absolute_coordinates(
                                        box_list.BoxList(tf.reshape(detection_boxes, [-1, 4])), image_shape[1],
                                        image_shape[2]).get(), [batch, max_num_detections, 4])
                                research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2599..2601

                                This issue has a mass of 32. (See the Duplicated Code and Tuning notes above.)

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        flat_normalized_proposals = box_list_ops.to_normalized_coordinates(
                                            box_list.BoxList(tf.reshape(proposal_boxes, [-1, 4])),
                                            image_shape[1], image_shape[2], check_range=False).get()
                                research/object_detection/meta_architectures/faster_rcnn_meta_arch.py on lines 2667..2669

                                This issue has a mass of 32. (See the Duplicated Code and Tuning notes above.)
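
                                The previous two entries both flatten a batch of boxes to [-1, 4], convert between normalized and absolute coordinates, and in one case reshape the result back; a single hedged helper (the name, the to_absolute flag, and output_shape are made up) could serve both call sites:

                                    import tensorflow.compat.v1 as tf

                                    from object_detection.core import box_list
                                    from object_detection.utils import box_list_ops


                                    def _convert_flat_boxes(boxes, height, width, to_absolute,
                                                            output_shape=None):
                                      # Hypothetical helper: flattens `boxes` to [-1, 4], converts between
                                      # normalized and absolute coordinates, and optionally reshapes the
                                      # converted boxes back to `output_shape`.
                                      flat_boxes = box_list.BoxList(tf.reshape(boxes, [-1, 4]))
                                      if to_absolute:
                                        converted = box_list_ops.to_absolute_coordinates(
                                            flat_boxes, height, width).get()
                                      else:
                                        converted = box_list_ops.to_normalized_coordinates(
                                            flat_boxes, height, width, check_range=False).get()
                                      if output_shape is not None:
                                        converted = tf.reshape(converted, output_shape)
                                      return converted

                                The first of the two sites would call it with to_absolute=True and output_shape=[batch, max_num_detections, 4]; the second with to_absolute=False and no reshape, passing image_shape[1] and image_shape[2] as height and width in both cases.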
