tensorflow/models

research/object_detection/utils/config_util.py

Summary

Maintainability: F (about 1 wk of estimated remediation effort)
Test Coverage: not reported

File config_util.py has 1024 lines of code (exceeds 250 allowed). Consider refactoring.

# Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in research/object_detection/utils/config_util.py - About 2 days to fix

    Function _maybe_update_config_with_key_value has a Cognitive Complexity of 30 (exceeds 5 allowed). Consider refactoring.

    def _maybe_update_config_with_key_value(configs, key, value):
      """Checks key type and updates `configs` with the key value pair accordingly.
    
      Args:
        configs: Dictionary of configuration objects. See outputs from
    Severity: Minor
    Found in research/object_detection/utils/config_util.py - About 4 hrs to fix
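The usual way to bring a key-dispatching updater like this under the threshold is to replace the `if`/`elif` chain with a handler table. A minimal sketch with hypothetical keys (not the file's real mapping):

```python
def _set_learning_rate(configs, value):
    # Hypothetical handler: coerce and store one well-known key.
    configs["learning_rate"] = float(value)

def _set_batch_size(configs, value):
    configs["batch_size"] = int(value)

# A single dict lookup replaces a chain of elif branches, so the
# cognitive complexity stays flat as more keys are supported.
_UPDATERS = {
    "learning_rate": _set_learning_rate,
    "batch_size": _set_batch_size,
}

def maybe_update(configs, key, value):
    handler = _UPDATERS.get(key)
    if handler is None:
        return False  # unknown key: leave configs untouched
    handler(configs, value)
    return True
```

Each handler stays trivially simple, and adding a key no longer adds a branch to the dispatcher.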

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"

    Further reading
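To make the nesting rule concrete, here is a small illustration (hypothetical functions, not code from config_util.py). Both versions behave identically, but the guard-clause form keeps every check at nesting depth zero and therefore scores lower:

```python
def classify_nested(x):
    # Each nested flow break adds an increment plus a nesting penalty.
    if x is not None:
        if isinstance(x, int):
            if x > 0:
                return "positive int"
            else:
                return "non-positive int"
        else:
            return "not an int"
    else:
        return "missing"

def classify_flat(x):
    # Same behavior, but every check sits at nesting depth zero,
    # so the cognitive complexity is lower.
    if x is None:
        return "missing"
    if not isinstance(x, int):
        return "not an int"
    if x > 0:
        return "positive int"
    return "non-positive int"
```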

    Function update_input_reader_config has a Cognitive Complexity of 25 (exceeds 5 allowed). Consider refactoring.

    def update_input_reader_config(configs,
                                   key_name=None,
                                   input_name=None,
                                   field_name=None,
                                   value=None,
    Severity: Minor
    Found in research/object_detection/utils/config_util.py - About 3 hrs to fix

    Function _update_initial_learning_rate has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

    def _update_initial_learning_rate(configs, learning_rate):
      """Updates `configs` to reflect the new initial learning rate.
    
      This function updates the initial learning rate. For learning rate schedules,
      all other defined learning rates in the pipeline config are scaled to maintain
    Severity: Minor
    Found in research/object_detection/utils/config_util.py - About 1 hr to fix
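The docstring's scaling behavior can be sketched over a plain list of rates (an assumption-laden stand-in; the real function walks the optimizer proto's learning rate schedule instead): every rate is multiplied by the same factor, so relative ratios between schedule steps are preserved.

```python
def scale_schedule(rates, new_initial):
    # Scale so the first rate becomes new_initial while keeping the
    # ratio between consecutive schedule steps unchanged.
    if not rates:
        return []
    factor = new_initial / rates[0]
    return [rate * factor for rate in rates]
```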

    Function update_input_reader_config has 6 arguments (exceeds 4 allowed). Consider refactoring.

    def update_input_reader_config(configs,
    Severity: Minor
    Found in research/object_detection/utils/config_util.py - About 45 mins to fix
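A common remedy for the argument-count smell is to group the related parameters into one small value object. A sketch under the assumption that call sites can be migrated (the `InputReaderUpdate` class and slimmed function below are hypothetical, not part of the repo):

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class InputReaderUpdate:
    # One object carries the four related parameters, so the function
    # drops from six arguments to two.
    key_name: Optional[str] = None
    input_name: Optional[str] = None
    field_name: Optional[str] = None
    value: Any = None

def update_input_reader(configs, update):
    # Hypothetical slimmed signature; shown only to illustrate the shape.
    configs[update.field_name] = update.value
    return configs
```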

      Function get_configs_from_multiple_files has 6 arguments (exceeds 4 allowed). Consider refactoring.

      def get_configs_from_multiple_files(model_config_path="",
      Severity: Minor
      Found in research/object_detection/utils/config_util.py - About 45 mins to fix

        Function merge_external_params_with_configs has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

        def merge_external_params_with_configs(configs, hparams=None, kwargs_dict=None):
          """Updates `configs` dictionary based on supplied parameters.
        
          This utility is for modifying specific fields in the object detection configs.
          Say that one would like to experiment with different learning rates, momentum
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 45 mins to fix

        Function check_and_parse_input_config_key has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

        def check_and_parse_input_config_key(configs, key):
          """Checks key and returns specific fields if key is valid input config update.
        
          Args:
            configs: Dictionary of configuration objects. See outputs from
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 45 mins to fix

        Function _check_and_convert_legacy_input_config_key has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def _check_and_convert_legacy_input_config_key(key):
          """Checks key and converts legacy input config update to specific update.
        
          Args:
            key: string indicates the target of update operation.
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 35 mins to fix

        Function get_spatial_image_size has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def get_spatial_image_size(image_resizer_config):
          """Returns expected spatial size of the output image from a given config.
        
          Args:
            image_resizer_config: An image_resizer_pb2.ImageResizer.
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 35 mins to fix
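The branching that drives this complexity can be sketched over plain dicts standing in for the `image_resizer_pb2.ImageResizer` oneof (field names follow the proto, but treat the snippet as a sketch, not the file's implementation): fixed-shape resizers have a known output size, keep-aspect-ratio resizers only do when they pad to the max dimension, and anything else is input-dependent.

```python
def spatial_size(resizer):
    # Fixed-shape resizer: output size is known up front.
    if "fixed_shape_resizer" in resizer:
        fixed = resizer["fixed_shape_resizer"]
        return [fixed["height"], fixed["width"]]
    # Keep-aspect-ratio: only static when padded to the max dimension.
    if "keep_aspect_ratio_resizer" in resizer:
        keep = resizer["keep_aspect_ratio_resizer"]
        if keep.get("pad_to_max_dimension"):
            return [keep["max_dimension"]] * 2
    # Otherwise the spatial size depends on the input image.
    return [-1, -1]
```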

        Function _update_rescore_instances has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def _update_rescore_instances(model_config, should_rescore):
          """Updates whether boxes should be rescored based on keypoint confidences."""
          if isinstance(should_rescore, str):
            should_rescore = should_rescore == "True"
          meta_architecture = model_config.WhichOneof("model")
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 35 mins to fix

        Function remove_unnecessary_ema has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

        def remove_unnecessary_ema(variables_to_restore, no_ema_collection=None):
          """Remap and Remove EMA variable that are not created during training.
        
          ExponentialMovingAverage.variables_to_restore() returns a map of EMA names
          to tf variables to restore. E.g.:
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 25 mins to fix
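The remapping described in the docstring can be sketched over a plain name-to-variable dict (the real function operates on the map returned by `ExponentialMovingAverage.variables_to_restore()`): names matching a pattern in `no_ema_collection` lose their EMA suffix, everything else passes through.

```python
def remap_ema(variables_to_restore, no_ema_collection=None):
    # Build a new dict rather than mutating while iterating.
    patterns = no_ema_collection or []
    remapped = {}
    for name, var in variables_to_restore.items():
        if ("/ExponentialMovingAverage" in name
                and any(p in name for p in patterns)):
            # Variable was never EMA-tracked during training:
            # restore it under its plain name instead.
            remapped[name.replace("/ExponentialMovingAverage", "")] = var
        else:
            remapped[name] = var
    return remapped
```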

        Function get_configs_from_multiple_files has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

        def get_configs_from_multiple_files(model_config_path="",
                                            train_config_path="",
                                            train_input_config_path="",
                                            eval_config_path="",
                                            eval_input_config_path="",
        Severity: Minor
        Found in research/object_detection/utils/config_util.py - About 25 mins to fix

        Similar blocks of code found in 3 locations. Consider refactoring.

        def _update_score_distance_multiplier(model_config, score_distance_multiplier):
          """Updates the keypoint candidate selection metric. See CenterNet proto."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 2 other locations - About 5 hrs to fix
        research/object_detection/utils/config_util.py on lines 1242..1255
        research/object_detection/utils/config_util.py on lines 1258..1271

        Duplicated Code

        Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

        Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

        When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

        Tuning

        This issue has a mass of 97.

        We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

        The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

        If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

        See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

        Refactorings

        Further Reading

        Similar blocks of code found in 3 locations. Consider refactoring.

        def _update_rescoring_threshold(model_config, rescoring_threshold):
          """Updates the keypoint candidate selection metric. See CenterNet proto."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 2 other locations - About 5 hrs to fix
        research/object_detection/utils/config_util.py on lines 1226..1239
        research/object_detection/utils/config_util.py on lines 1242..1255

        Similar blocks of code found in 3 locations. Consider refactoring.

        def _update_std_dev_multiplier(model_config, std_dev_multiplier):
          """Updates the keypoint candidate selection metric. See CenterNet proto."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 2 other locations - About 5 hrs to fix
        research/object_detection/utils/config_util.py on lines 1226..1239
        research/object_detection/utils/config_util.py on lines 1258..1271

        Similar blocks of code found in 5 locations. Consider refactoring.

        def _update_box_scale(model_config, box_scale):
          """Updates the keypoint candidate search region. See CenterNet proto."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 4 hrs to fix
        research/object_detection/utils/config_util.py on lines 1116..1124
        research/object_detection/utils/config_util.py on lines 1129..1137
        research/object_detection/utils/config_util.py on lines 1161..1169
        research/object_detection/utils/config_util.py on lines 1186..1194

        Similar blocks of code found in 5 locations. Consider refactoring.

        def _update_keypoint_candidate_score_threshold(model_config, threshold):
          """Updates the keypoint candidate score threshold. See CenterNet proto."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 4 hrs to fix
        research/object_detection/utils/config_util.py on lines 1116..1124
        research/object_detection/utils/config_util.py on lines 1129..1137
        research/object_detection/utils/config_util.py on lines 1161..1169
        research/object_detection/utils/config_util.py on lines 1174..1182

        Similar blocks of code found in 5 locations. Consider refactoring.

        def _update_score_distance_offset(model_config, offset):
          """Updates the keypoint candidate selection metric. See CenterNet proto."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 4 hrs to fix
        research/object_detection/utils/config_util.py on lines 1116..1124
        research/object_detection/utils/config_util.py on lines 1129..1137
        research/object_detection/utils/config_util.py on lines 1174..1182
        research/object_detection/utils/config_util.py on lines 1186..1194

        Similar blocks of code found in 5 locations. Consider refactoring.

        def _update_peak_max_pool_kernel_size(model_config, kernel_size):
          """Updates the max pool kernel size (NMS) for keypoints in CenterNet."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 4 hrs to fix
        research/object_detection/utils/config_util.py on lines 1129..1137
        research/object_detection/utils/config_util.py on lines 1161..1169
        research/object_detection/utils/config_util.py on lines 1174..1182
        research/object_detection/utils/config_util.py on lines 1186..1194

        Similar blocks of code found in 5 locations. Consider refactoring.

        def _update_candidate_search_scale(model_config, search_scale):
          """Updates the keypoint candidate search scale in CenterNet."""
          meta_architecture = model_config.WhichOneof("model")
          if meta_architecture == "center_net":
            if len(model_config.center_net.keypoint_estimation_task) == 1:
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 4 hrs to fix
        research/object_detection/utils/config_util.py on lines 1116..1124
        research/object_detection/utils/config_util.py on lines 1161..1169
        research/object_detection/utils/config_util.py on lines 1174..1182
        research/object_detection/utils/config_util.py on lines 1186..1194

        Similar blocks of code found in 2 locations. Consider refactoring.

        def _update_focal_loss_gamma(configs, gamma):
          """Updates the gamma value for a sigmoid focal loss.
        
          The configs dictionary is updated in place, and hence not returned.
        
        
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 1 other location - About 3 hrs to fix
        research/object_detection/utils/config_util.py on lines 932..950

        Similar blocks of code found in 2 locations. Consider refactoring.

        def _update_focal_loss_alpha(configs, alpha):
          """Updates the alpha value for a sigmoid focal loss.
        
          The configs dictionary is updated in place, and hence not returned.
        
        
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 1 other location - About 3 hrs to fix
        research/object_detection/utils/config_util.py on lines 911..929

This issue has a mass of 62.

        Similar blocks of code found in 5 locations. Consider refactoring.

          if train_input_config_path:
            train_input_config = input_reader_pb2.InputReader()
            with tf.gfile.GFile(train_input_config_path, "r") as f:
              text_format.Merge(f.read(), train_input_config)
              configs["train_input_config"] = train_input_config
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 2 hrs to fix
        research/lstm_object_detection/utils/config_util.py on lines 101..105
        research/object_detection/utils/config_util.py on lines 280..284
        research/object_detection/utils/config_util.py on lines 286..290
        research/object_detection/utils/config_util.py on lines 298..302

This issue has a mass of 53.

        Similar blocks of code found in 5 locations. Consider refactoring.

          if eval_config_path:
            eval_config = eval_pb2.EvalConfig()
            with tf.gfile.GFile(eval_config_path, "r") as f:
              text_format.Merge(f.read(), eval_config)
              configs["eval_config"] = eval_config
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 2 hrs to fix
        research/lstm_object_detection/utils/config_util.py on lines 101..105
        research/object_detection/utils/config_util.py on lines 280..284
        research/object_detection/utils/config_util.py on lines 286..290
        research/object_detection/utils/config_util.py on lines 292..296

This issue has a mass of 53.

        Similar blocks of code found in 5 locations. Consider refactoring.

          if model_config_path:
            model_config = model_pb2.DetectionModel()
            with tf.gfile.GFile(model_config_path, "r") as f:
              text_format.Merge(f.read(), model_config)
              configs["model"] = model_config
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 2 hrs to fix
        research/lstm_object_detection/utils/config_util.py on lines 101..105
        research/object_detection/utils/config_util.py on lines 286..290
        research/object_detection/utils/config_util.py on lines 292..296
        research/object_detection/utils/config_util.py on lines 298..302

This issue has a mass of 53.

        Similar blocks of code found in 5 locations. Consider refactoring.

          if train_config_path:
            train_config = train_pb2.TrainConfig()
            with tf.gfile.GFile(train_config_path, "r") as f:
              text_format.Merge(f.read(), train_config)
              configs["train_config"] = train_config
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 4 other locations - About 2 hrs to fix
        research/lstm_object_detection/utils/config_util.py on lines 101..105
        research/object_detection/utils/config_util.py on lines 280..284
        research/object_detection/utils/config_util.py on lines 292..296
        research/object_detection/utils/config_util.py on lines 298..302
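All five of these blocks follow the same shape: if a path was supplied, construct the proto, read the file, merge the text, and store the result under a configs key. A table-driven loop removes the repetition. The sketch below is a self-contained illustration: `FakeConfig`, `merge_text`, and `load_override_configs` are hypothetical names, plain `open` stands in for `tf.gfile.GFile`, and `merge_text` stands in for `text_format.Merge`.

```python
class FakeConfig(object):
  """Stand-in for a proto such as model_pb2.DetectionModel()."""

  def __init__(self):
    self.text = None


def merge_text(text, config):
  """Stand-in for google.protobuf.text_format.Merge."""
  config.text = text


def load_override_configs(configs, overrides):
  """Loads every supplied override config file into `configs`.

  Args:
    configs: Dictionary of configuration objects, updated in place.
    overrides: Maps a configs key to an (optional_path, config_factory) pair;
      entries whose path is falsy are skipped.
  """
  for key, (path, factory) in overrides.items():
    if not path:
      continue
    config = factory()
    with open(path, "r") as f:  # the real code would use tf.gfile.GFile
      merge_text(f.read(), config)
    configs[key] = config
```

With the real protos, the table would map `"model"` to `model_pb2.DetectionModel`, `"train_config"` to `train_pb2.TrainConfig`, and so on, replacing all five `if <x>_config_path:` blocks with one call.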

This issue has a mass of 53.

        Similar blocks of code found in 2 locations. Consider refactoring.

        def get_context_feature_length(model_config):
          """Returns context feature length from a given config.
        
          Args:
            model_config: A model config file.
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 1 other location - About 1 hr to fix
        research/object_detection/utils/config_util.py on lines 88..103

This issue has a mass of 46.

        Similar blocks of code found in 2 locations. Consider refactoring.

        def get_max_num_context_features(model_config):
          """Returns maximum number of context features from a given config.
        
          Args:
            model_config: A model config file.
        Severity: Major
        Found in research/object_detection/utils/config_util.py and 1 other location - About 1 hr to fix
        research/object_detection/utils/config_util.py on lines 106..119

This issue has a mass of 46.

        Identical blocks of code found in 2 locations. Consider refactoring.

          with tf.gfile.GFile(pipeline_config_path, "r") as f:
            proto_str = f.read()
            text_format.Merge(proto_str, pipeline_config)
        Severity: Minor
        Found in research/object_detection/utils/config_util.py and 1 other location - About 40 mins to fix
        research/lstm_object_detection/utils/config_util.py on lines 44..46

This issue has a mass of 34.

        Identical blocks of code found in 2 locations. Consider refactoring.

            if field_name == "input_path":
              path_updater(input_config=target_input_config, input_path=value)
            else:
              setattr(target_input_config, field_name, value)
        Severity: Minor
        Found in research/object_detection/utils/config_util.py and 1 other location - About 30 mins to fix
        research/object_detection/utils/config_util.py on lines 705..708

This issue has a mass of 32.

        Identical blocks of code found in 2 locations. Consider refactoring.

            if field_name == "input_path":
              path_updater(input_config=target_input_config, input_path=value)
            else:
              setattr(target_input_config, field_name, value)
        Severity: Minor
        Found in research/object_detection/utils/config_util.py and 1 other location - About 30 mins to fix
        research/object_detection/utils/config_util.py on lines 712..715
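These two identical branches can be extracted into one small helper so the dispatch logic lives in a single place. The name `_set_input_reader_field` is illustrative; the demo objects below stand in for the real InputReader proto and path-updating function.

```python
def _set_input_reader_field(target_input_config, field_name, value, path_updater):
  """Applies one override to an input reader config.

  `input_path` needs special handling (it is updated via `path_updater`);
  every other field is a plain attribute assignment.
  """
  if field_name == "input_path":
    path_updater(input_config=target_input_config, input_path=value)
  else:
    setattr(target_input_config, field_name, value)


# Minimal stand-ins to demonstrate both branches.
class DummyInputReader(object):
  pass


def dummy_path_updater(input_config, input_path):
  input_config.input_path = [input_path]


reader = DummyInputReader()
_set_input_reader_field(reader, "input_path", "train.record", dummy_path_updater)
_set_input_reader_field(reader, "num_readers", 4, dummy_path_updater)
```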

This issue has a mass of 32.

        Similar blocks of code found in 2 locations. Consider refactoring.

        def _update_mask_type(configs, mask_type):
          """Updates the mask type for both train and eval input readers.
        
          The configs dictionary is updated in place, and hence not returned.
        
        
        Severity: Minor
        Found in research/object_detection/utils/config_util.py and 1 other location - About 30 mins to fix
        research/object_detection/utils/config_util.py on lines 964..975

This issue has a mass of 32.

        Similar blocks of code found in 2 locations. Consider refactoring.

        def _update_label_map_path(configs, label_map_path):
          """Updates the label map path for both train and eval input readers.
        
          The configs dictionary is updated in place, and hence not returned.
        
        
        Severity: Minor
        Found in research/object_detection/utils/config_util.py and 1 other location - About 30 mins to fix
        research/object_detection/utils/config_util.py on lines 980..992
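`_update_mask_type` and `_update_label_map_path` differ only in which field they set on the train and eval input readers, so one field-agnostic helper covers both. The sketch is illustrative: the helper name is invented, and it assumes the configs dictionary holds a `train_input_config` and a list under `eval_input_configs`, as in current versions of config_util.py.

```python
def _update_field_on_all_input_readers(configs, field_name, value):
  """Sets `field_name` on the train input reader and every eval input reader."""
  setattr(configs["train_input_config"], field_name, value)
  for eval_input_config in configs["eval_input_configs"]:
    setattr(eval_input_config, field_name, value)


# Minimal stand-ins for the InputReader protos.
class DummyInputReader(object):
  pass


configs = {
    "train_input_config": DummyInputReader(),
    "eval_input_configs": [DummyInputReader(), DummyInputReader()],
}
_update_field_on_all_input_readers(
    configs, "label_map_path", "data/label_map.pbtxt")
```

Both existing functions then reduce to one-line wrappers naming the field they update.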

This issue has a mass of 32.
