tensorflow/tensorflow

tensorflow/python/keras/distribute/distribute_coordinator_utils.py

Summary

Maintainability: F (estimated 2 wks of remediation)
Test Coverage

File distribute_coordinator_utils.py has 533 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# Copyright 2018 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 1 day to fix

    Function run_distribute_coordinator has a Cognitive Complexity of 35 (exceeds 5 allowed). Consider refactoring.
    Open

    def run_distribute_coordinator(worker_fn,
                                   strategy,
                                   eval_fn=None,
                                   eval_strategy=None,
                                   cluster_spec=None,
    Severity: Minor
    Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 5 hrs to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"
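
A minimal, hypothetical sketch of what these rules mean in practice (the functions below are illustrative and are not taken from this file): the nested form pays a nesting penalty for every inner branch, while guard clauses express the same lookup with the control flow kept flat.

# Hypothetical example, not code from distribute_coordinator_utils.py.
# Each nested `if` adds an increment plus a nesting penalty to the score.
def pick_address(cluster, task_type, task_id):
  address = ""
  if cluster:
    if task_type in cluster:
      if task_id < len(cluster[task_type]):
        address = cluster[task_type][task_id]
  return address

# Guard clauses cover the same cases without nesting, so the cognitive
# complexity stays low even though the number of branches is unchanged.
def pick_address_flat(cluster, task_type, task_id):
  if not cluster:
    return ""
  if task_type not in cluster:
    return ""
  if task_id >= len(cluster[task_type]):
    return ""
  return cluster[task_type][task_id]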

    Function _run_single_worker has 9 arguments (exceeds 4 allowed). Consider refactoring.
    Open

    def _run_single_worker(worker_fn,
    Severity: Major
    Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 1 hr to fix
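
One common way to bring a long parameter list back under the limit is to bundle the related cluster/task identity fields into a single value object and pass that object through. A rough sketch under assumed names; only worker_fn, strategy, cluster_spec, task_type, task_id, session_config and rpc_layer are visible in the report snippets, and the rest of the real signature is not reproduced here.

# Hypothetical refactoring sketch, not the existing function.
import dataclasses
from typing import Any, Callable, Optional

@dataclasses.dataclass
class WorkerEnv:
  """Bundles the per-worker identity and configuration arguments."""
  cluster_spec: Any = None
  task_type: Optional[str] = None
  task_id: Optional[int] = None
  session_config: Any = None
  rpc_layer: Optional[str] = None

def run_single_worker(worker_fn: Callable[..., Any], strategy: Any, env: WorkerEnv) -> Any:
  # Callers construct one WorkerEnv and hand it over, so the function now
  # takes three parameters instead of nine.
  return worker_fn(strategy, env)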

      Function run_distribute_coordinator has 9 arguments (exceeds 4 allowed). Consider refactoring.
      Open

      def run_distribute_coordinator(worker_fn,
      Severity: Major
      Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 1 hr to fix

        Function _run_std_server has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.
        Open

        def _run_std_server(cluster_spec=None,
                            task_type=None,
                            task_id=None,
                            session_config=None,
                            rpc_layer=None,
        Severity: Minor
        Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 1 hr to fix

        Function __init__ has 7 arguments (exceeds 4 allowed). Consider refactoring.
        Open

          def __init__(self,
        Severity: Major
        Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 50 mins to fix

          Function _configure_session_config_for_std_servers has 6 arguments (exceeds 4 allowed). Consider refactoring.
          Open

          def _configure_session_config_for_std_servers(strategy, eval_strategy,
          Severity: Minor
          Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 45 mins to fix

            Function _run_std_server has 6 arguments (exceeds 4 allowed). Consider refactoring.
            Open

            def _run_std_server(cluster_spec=None,
            Severity: Minor
            Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 45 mins to fix

              Function _get_master_target has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
              Open

                def _get_master_target(self):
                  """Return the master target for a task."""
                  # If cluster_spec is None or empty, we use local master.
                  if not self._cluster_spec or self._task_type == _TaskType.EVALUATOR:
                    return ""
              Severity: Minor
              Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 45 mins to fix

              Function session_creator has 5 arguments (exceeds 4 allowed). Consider refactoring.
              Open

                def session_creator(self,
              Severity: Minor
              Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 35 mins to fix

                Function _run_single_worker has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
                Open

                def _run_single_worker(worker_fn,
                                       strategy,
                                       cluster_spec,
                                       task_type,
                                       task_id,
                Severity: Minor
                Found in tensorflow/python/keras/distribute/distribute_coordinator_utils.py - About 25 mins to fix

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                def _run_std_server(cluster_spec=None,
                                    task_type=None,
                                    task_id=None,
                                    session_config=None,
                                    rpc_layer=None,
                tensorflow/python/distribute/distribute_coordinator.py on lines 381..447

                Duplicated Code

                Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

                Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

                When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
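
For this particular finding, one way to restore a single authoritative copy would be for the Keras utility module to delegate to the implementation in tensorflow/python/distribute/distribute_coordinator.py (the other location reported above) instead of re-implementing it. A sketch only: the sixth parameter name and the exact private signature are assumptions based on the truncated snippet, and re-using a module-private helper is a design call for the maintainers.

# Sketch: forward to the canonical implementation so the two copies cannot drift.
from tensorflow.python.distribute import distribute_coordinator as dc

def _run_std_server(cluster_spec=None,
                    task_type=None,
                    task_id=None,
                    session_config=None,
                    rpc_layer=None,
                    environment=None):  # `environment` is assumed, not shown in the snippet
  return dc._run_std_server(  # pylint: disable=protected-access
      cluster_spec=cluster_spec,
      task_type=task_type,
      task_id=task_id,
      session_config=session_config,
      rpc_layer=rpc_layer,
      environment=environment)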

                Tuning

                This issue has a mass of 354.

                We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

                The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

                If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

                See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  def _get_master_target(self):
                    """Return the master target for a task."""
                    # If cluster_spec is None or empty, we use local master.
                    if not self._cluster_spec or self._task_type == _TaskType.EVALUATOR:
                      return ""
                tensorflow/python/distribute/distribute_coordinator.py on lines 165..188

This issue has a mass of 170.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  def session_creator(self,
                                      scaffold=None,
                                      config=None,
                                      checkpoint_dir=None,
                                      checkpoint_filename_with_path=None,
                tensorflow/python/distribute/distribute_coordinator.py on lines 214..258

This issue has a mass of 141.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                    if task_type in [_TaskType.CHIEF, _TaskType.WORKER]:
                      if strategy.extended.experimental_between_graph:
                        # All jobs run `worker_fn` if between-graph.
                        return _run_single_worker(worker_fn, strategy, cluster_spec, task_type,
                                                  task_id, session_config, rpc_layer)
                tensorflow/python/distribute/distribute_coordinator.py on lines 853..872

This issue has a mass of 137.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                def _run_single_worker(worker_fn,
                                       strategy,
                                       cluster_spec,
                                       task_type,
                                       task_id,
                tensorflow/python/distribute/distribute_coordinator.py on lines 322..356

This issue has a mass of 137.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  def __init__(self,
                               strategy,
                               cluster_spec,
                               task_type,
                               task_id,
                tensorflow/python/distribute/distribute_coordinator.py on lines 108..142

This issue has a mass of 102.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  def _is_chief(self):
                    """Return whether the task is the chief worker."""
                    if (not self._cluster_spec or
                        self._task_type in [_TaskType.CHIEF, _TaskType.EVALUATOR, None]):
                      return True
                tensorflow/python/distribute/distribute_coordinator.py on lines 190..201

This issue has a mass of 89.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                def _configure_session_config_for_std_servers(strategy, eval_strategy,
                                                              session_config, cluster_spec,
                                                              task_type, task_id):
                  # pylint: disable=g-doc-args
                  """Call strategy's `configure` to mutate the session_config.
                tensorflow/python/distribute/distribute_coordinator.py on lines 525..548

This issue has a mass of 82.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  if not cluster_spec:
                    cluster_spec = tf_config.get("cluster", {})
                    task_env = tf_config.get("task", {})
                    if task_env:
                      task_type = task_env.get("type", task_type)
                tensorflow/python/distribute/distribute_coordinator.py on lines 752..757

This issue has a mass of 64.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                def _get_num_workers(cluster_spec):
                  """Gets number of workers including chief."""
                  if not cluster_spec:
                    return 0
                  return len(cluster_spec.as_dict().get(_TaskType.WORKER, [])) + len(
                tensorflow/python/distribute/distribute_coordinator.py on lines 91..96

This issue has a mass of 63.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  if cluster_spec:
                    # TODO(yuefengz): validate cluster_spec.
                    cluster_spec = normalize_cluster_spec(cluster_spec)
                  elif hasattr(strategy.extended, "_cluster_resolver"):
                    cluster_resolver = strategy.extended._cluster_resolver  # pylint: disable=protected-access
                tensorflow/python/distribute/distribute_coordinator.py on lines 759..768

This issue has a mass of 62.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                def normalize_cluster_spec(cluster_spec):
                  """Makes `cluster_spec` into a `ClusterSpec` object.
                
                  Args:
                    cluster_spec: a dict, ClusterDef or ClusterSpec object specifying the
                tensorflow/python/distribute/multi_worker_util.py on lines 22..42

This issue has a mass of 58.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  if task_type == _TaskType.EVALUATOR:
                    assert _TaskType.EVALUATOR in new_cluster_spec
                    new_cluster_spec = {
                        _TaskType.EVALUATOR: new_cluster_spec[_TaskType.EVALUATOR]
                    }
                tensorflow/python/distribute/distribute_coordinator.py on lines 371..377

This issue has a mass of 55.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                    if (task_type != _TaskType.EVALUATOR and
                        not getattr(strategy.extended, "_std_server_started", False)):
                      # Right now, with eager mode, context is configured with a std server at
                      # the very beginning while with graph mode the std server is started when
                      # distribute coordinator is called. We should consolidate these two paths.
                tensorflow/python/distribute/distribute_coordinator.py on lines 841..846

This issue has a mass of 55.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  if not cluster_spec:
                    # `mode` is ignored in the local case.
                    logging.info("Running local Distribute Coordinator.")
                    _run_single_worker(worker_fn, strategy, None, None, None, session_config,
                                       rpc_layer)
                tensorflow/python/distribute/distribute_coordinator.py on lines 781..790

This issue has a mass of 50.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  @property
                  def distributed_mode(self):
                    """Whether it is distributed training or not."""
                    return bool(self._cluster_spec) and self._task_type != _TaskType.EVALUATOR
                tensorflow/python/distribute/distribute_coordinator.py on lines 271..274

This issue has a mass of 35.

                Identical blocks of code found in 2 locations. Consider refactoring.
                Open

                  def _debug_message(self):
                    if self._cluster_spec:
                      return "[cluster_spec: %r, task_type: %r, task_id: %r]" % (
                          self._cluster_spec, self.task_type, self.task_id)
                    else:
                tensorflow/python/distribute/distribute_coordinator.py on lines 144..149

This issue has a mass of 35.
