tensorflow/tensorflow

tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py

Summary

Maintainability: F (estimated 1 wk to fix)

File rnn_cell_impl.py has 1120 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 2 days to fix

    Function _concat has a Cognitive Complexity of 25 (exceeds 5 allowed). Consider refactoring.
    Open

    def _concat(prefix, suffix, static=False):
      """Concat that enables int, Tensor, or TensorShape values.
    
      This function takes a size specification, which can be an integer, a
      TensorShape, or a Tensor, and converts it into a concatenated Tensor
    Severity: Minor
    Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 3 hrs to fix
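
    One way to bring a score like this down is to move each input-type case into its own small helper, so the main function reads as a single flat step. A simplified sketch of that shape, using plain Python values in place of Tensor/TensorShape (the helper names are illustrative, not code from this file):

      def _as_dims(value):
        """Normalize a size spec (None, int, or iterable of ints) to a list of ints."""
        if value is None:
          return []
        if isinstance(value, int):
          return [value]
        return [int(v) for v in value]

      def concat_sizes(prefix, suffix):
        """Concatenate two size specs; each input case lives in one tiny helper."""
        return _as_dims(prefix) + _as_dims(suffix)

      assert concat_sizes(32, (128, 4)) == [32, 128, 4]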

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"
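
    For example, the nested version of a simple search scores higher than an equivalent flat version, because each extra level of nesting compounds the cost of the flow-breaking structures inside it (illustrative snippet, not code from this file):

      # Nested: the loop and the inner condition sit inside an outer branch.
      def first_even_nested(values):
        if values:
          for v in values:
            if v % 2 == 0:
              return v
        return None

      # Flat: guard-style iteration removes one nesting level and reads linearly.
      def first_even_flat(values):
        for v in values or ():
          if v % 2 == 0:
            return v
        return None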


    Function __init__ has 15 arguments (exceeds 4 allowed). Consider refactoring.
    Open

      def __init__(self,
    Severity: Major
    Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 1 hr to fix
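
    A common way to shrink a parameter list like this is to group related settings into a configuration object that is passed as a single argument. A hedged sketch of the pattern (the class and field names below are placeholders, not the cell's actual API):

      import dataclasses
      from typing import Optional

      @dataclasses.dataclass
      class CellConfig:
        """Bundles constructor settings that usually travel together."""
        num_units: int
        use_peepholes: bool = False
        cell_clip: Optional[float] = None

      class ConfiguredCell:
        def __init__(self, config: CellConfig, name: Optional[str] = None):
          # Two parameters instead of fifteen; settings stay grouped and documented.
          self.config = config
          self.name = name

      cell = ConfiguredCell(CellConfig(num_units=128, cell_clip=5.0))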

      Function get_initial_state has a Cognitive Complexity of 14 (exceeds 5 allowed). Consider refactoring.
      Open

        def get_initial_state(self, inputs=None, batch_size=None, dtype=None):
          if inputs is not None:
            # Validate the given batch_size and dtype against inputs if provided.
            inputs = tensor_conversion.convert_to_tensor_v2_with_dispatch(
                inputs, name="inputs"
      Severity: Minor
      Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 1 hr to fix
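
      Much of the score here comes from the branching needed to cross-check the optional inputs, batch_size, and dtype arguments. Pulling that validation into a dedicated helper keeps the method itself linear; a simplified sketch with placeholder names (_resolve_batch_and_dtype is not an existing helper in this file):

        def _resolve_batch_and_dtype(inputs, batch_size, dtype):
          """Check an explicit batch_size/dtype against inputs, if inputs is given."""
          if inputs is None:
            return batch_size, dtype
          inferred_batch, inferred_dtype = inputs.shape[0], inputs.dtype
          if batch_size is not None and batch_size != inferred_batch:
            raise ValueError("batch_size does not match the batch dimension of inputs")
          if dtype is not None and dtype != inferred_dtype:
            raise ValueError("dtype does not match the dtype of inputs")
          return inferred_batch, inferred_dtype

        # get_initial_state then reduces to roughly:
        #   batch_size, dtype = _resolve_batch_and_dtype(inputs, batch_size, dtype)
        #   return self.zero_state(batch_size, dtype)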

      Function __init__ has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.
      Open

        def __init__(self,
                     num_units,
                     use_peepholes=False,
                     cell_clip=None,
                     initializer=None,
      Severity: Minor
      Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 1 hr to fix

      Function call has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.
      Open

        def call(self, inputs, state):
          """Run one step of LSTM.
      
          Args:
            inputs: input Tensor, must be 2-D, `[batch, input_size]`.
      Severity: Minor
      Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 1 hr to fix

      Function __init__ has 8 arguments (exceeds 4 allowed). Consider refactoring.
      Open

        def __init__(self,
      Severity: Major
      Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 1 hr to fix

        Function __init__ has 8 arguments (exceeds 4 allowed). Consider refactoring.
        Open

          def __init__(self,
        Severity: Major
        Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 1 hr to fix

          Function build has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
          Open

            def build(self, inputs_shape):
              if inputs_shape[-1] is None:
                raise ValueError("Expected inputs.shape[-1] to be known, saw shape: %s" %
                                 str(inputs_shape))
              _check_supported_dtypes(self.dtype)
          Severity: Minor
          Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 55 mins to fix

          Function non_trainable_weights has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
          Open

            def non_trainable_weights(self):
              weights = []
              for cell in self._cells:
                if isinstance(cell, base_layer.Layer):
                  weights += cell.non_trainable_weights
          Severity: Minor
          Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 55 mins to fix

          Function __init__ has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
          Open

            def __init__(self, cells, state_is_tuple=True):
              """Create a RNN cell composed sequentially of a number of RNNCells.
          
              Args:
                cells: list of RNNCells that will be composed in this order.
          Severity: Minor
          Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 55 mins to fix

          Function __init__ has 6 arguments (exceeds 4 allowed). Consider refactoring.
          Open

            def __init__(self,
          Severity: Minor
          Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 45 mins to fix

            Function call has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
            Open

              def call(self, inputs, state):
                """Run this multi-layer cell on inputs, starting from state."""
                cur_state_pos = 0
                cur_inp = inputs
                new_states = []
            Severity: Minor
            Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 45 mins to fix

            Function __call__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
            Open

              def __call__(self, inputs, state, scope=None, *args, **kwargs):
            Severity: Minor
            Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 35 mins to fix

              Function _rnn_get_variable has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
              Open

                def _rnn_get_variable(self, getter, *args, **kwargs):
                  variable = getter(*args, **kwargs)
                  if ops.executing_eagerly_outside_functions():
                    trainable = variable.trainable
                  else:
              Severity: Minor
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 25 mins to fix

              Function zero_state has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
              Open

                def zero_state(self, batch_size, dtype):
                  """Return zero-filled state tensor(s).
              
                  Args:
                    batch_size: int, float, or unit Tensor representing the batch size.
              Severity: Minor
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py - About 25 mins to fix

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

              def _concat(prefix, suffix, static=False):
                """Concat that enables int, Tensor, or TensorShape values.
              
                This function takes a size specification, which can be an integer, a
                TensorShape, or a Tensor, and converts it into a concatenated Tensor
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 3 days to fix
              tensorflow/python/ops/rnn_cell_impl.py on lines 65..133

              Duplicated Code

              Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

              Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

              When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

              Tuning

              This issue has a mass of 407.

              We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

              The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

              If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

              See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
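
              For a duplicate like this, the usual fix is to keep a single authoritative definition and have the other module import it rather than redefine it. A sketch of the shape such a change could take, assuming the legacy Keras module can import from tensorflow.python.ops without creating an import cycle (that assumption is not verified here):

                # In tensorflow/python/ops/rnn_cell_impl.py, keep the one real definition:
                #   def _concat(prefix, suffix, static=False):
                #     ...
                #
                # In tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py, re-export it
                # instead of maintaining a second copy:
                from tensorflow.python.ops.rnn_cell_impl import _concat  # noqa: F401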


              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

              def assert_like_rnncell(cell_name, cell):
                """Raises a TypeError if cell is not like an RNNCell.
              
                NOTE: Do not rely on the error message (in particular in tests) which can be
                subject to change to increase readability. Use
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 7 hrs to fix
              tensorflow/python/ops/rnn_cell_impl.py on lines 145..177

              This issue has a mass of 119.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

              def _zero_state_tensors(state_size, batch_size, dtype):
                """Create tensors of zeros based on state_size, batch_size, and dtype."""
              
                def get_state_shape(s):
                  """Combine s with batch_size to get a proper tensor shape."""
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 5 hrs to fix
              tensorflow/python/ops/rnn_cell_impl.py on lines 50..62

              This issue has a mass of 93.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def get_config(self):
                  config = {
                      "cell": {
                          "class_name": self.cell.__class__.__name__,
                          "config": self.cell.get_config()
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 4 hrs to fix
              tensorflow/python/keras/layers/rnn_cell_wrapper_v2.py on lines 76..84

              This issue has a mass of 76.
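
              Both wrappers serialize their inner cell with the same nested dictionary, so that piece could live in one shared helper that each get_config calls. A minimal sketch (the helper name _serialize_wrapped_cell is illustrative, not an existing function):

                def _serialize_wrapped_cell(cell):
                  """Build the nested config dict used by both wrapper get_config methods."""
                  return {
                      "cell": {
                          "class_name": cell.__class__.__name__,
                          "config": cell.get_config(),
                      }
                  }

                # Each wrapper's get_config would then start from:
                #   config = _serialize_wrapped_cell(self.cell)
                # and add only its wrapper-specific entries before merging with the
                # base class config.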

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

              @tf_export(v1=["nn.rnn_cell.ResidualWrapper"])
              class ResidualWrapper(rnn_cell_wrapper_impl.ResidualWrapperBase,
                                    _RNNCellWrapperV1):
                """RNNCell wrapper that ensures cell inputs are added to the outputs."""
              
              
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 2 hrs to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 1191..1199

              This issue has a mass of 55.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

              @tf_export(v1=["nn.rnn_cell.DropoutWrapper"])
              class DropoutWrapper(rnn_cell_wrapper_impl.DropoutWrapperBase,
                                   _RNNCellWrapperV1):
                """Operator adding dropout to inputs and outputs of the given cell."""
              
              
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 2 hrs to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 1202..1210

              This issue has a mass of 55.

              Identical blocks of code found in 4 locations. Consider refactoring.
              Open

                  if inputs_shape[-1] is None:
                    raise ValueError("Expected inputs.shape[-1] to be known, saw shape: %s" %
                                     str(inputs_shape))
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 3 other locations - About 35 mins to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 460..462
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 748..750
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 971..973

              This issue has a mass of 33.
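
              All four copies of this guard could call one module-level helper instead; a hedged sketch (the name _check_known_input_depth is illustrative):

                def _check_known_input_depth(inputs_shape):
                  """Raise if the innermost input dimension is not statically known."""
                  if inputs_shape[-1] is None:
                    raise ValueError(
                        "Expected inputs.shape[-1] to be known, saw shape: %s"
                        % str(inputs_shape))

                # Each build() would then begin with:
                #   _check_known_input_depth(inputs_shape)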

              Identical blocks of code found in 4 locations. Consider refactoring.
              Open

                  if inputs_shape[-1] is None:
                    raise ValueError("Expected inputs.shape[-1] to be known, saw shape: %s" %
                                     str(inputs_shape))
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 3 other locations - About 35 mins to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 570..572
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 748..750
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 971..973

              This issue has a mass of 33.

              Identical blocks of code found in 4 locations. Consider refactoring.
              Open

                  if inputs_shape[-1] is None:
                    raise ValueError("Expected inputs.shape[-1] to be known, saw shape: %s" %
                                     str(inputs_shape))
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 3 other locations - About 35 mins to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 460..462
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 570..572
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 971..973

              This issue has a mass of 33.

              Identical blocks of code found in 4 locations. Consider refactoring.
              Open

                  if inputs_shape[-1] is None:
                    raise ValueError("Expected inputs.shape[-1] to be known, saw shape: %s" %
                                     str(inputs_shape))
              Severity: Major
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 3 other locations - About 35 mins to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 460..462
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 570..572
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 748..750

              This issue has a mass of 33.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  if self._cell_clip is not None:
                    # pylint: disable=invalid-unary-operand-type
                    c = clip_ops.clip_by_value(c, -self._cell_clip, self._cell_clip)
              Severity: Minor
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 35 mins to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 1081..1083

              This issue has a mass of 33.
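
              This clip site and the proj_clip site below follow the same guard-then-clip pattern, which could be folded into one small helper; a minimal sketch (_maybe_clip is an illustrative name, and clip_ops is the module already used at both call sites):

                from tensorflow.python.ops import clip_ops

                def _maybe_clip(value, limit):
                  """Clip value to [-limit, limit] when a limit is configured."""
                  if limit is None:
                    return value
                  return clip_ops.clip_by_value(value, -limit, limit)

                # The two call sites would then read:
                #   c = _maybe_clip(c, self._cell_clip)
                #   m = _maybe_clip(m, self._proj_clip)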

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                    if self._proj_clip is not None:
                      # pylint: disable=invalid-unary-operand-type
                      m = clip_ops.clip_by_value(m, -self._proj_clip, self._proj_clip)
              Severity: Minor
              Found in tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py and 1 other location - About 35 mins to fix
              tensorflow/python/keras/layers/legacy_rnn/rnn_cell_impl.py on lines 1069..1071

              This issue has a mass of 33.
