tensorflow/tensorflow

tensorflow/python/kernel_tests/nn_ops/losses_test.py

Summary

Maintainability: F (est. 1 mo of remediation)
Test Coverage: —

File losses_test.py has 1303 lines of code (exceeds 250 allowed). Consider refactoring.

# Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 3 days to fix

    SparseSoftmaxCrossEntropyLossTest has 23 functions (exceeds 20 allowed). Consider refactoring.

    class SparseSoftmaxCrossEntropyLossTest(test.TestCase):
    
      def testNoneWeightRaisesValueError(self):
        logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                       [0.0, 0.0, 10.0]])
    Severity: Minor
    Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 2 hrs to fix

      ComputeWeightedLossTest has 21 functions (exceeds 20 allowed). Consider refactoring.

      class ComputeWeightedLossTest(test.TestCase):
      
        def setUp(self):
          super(ComputeWeightedLossTest, self).setUp()
          self._shape = (3, 2, 4)
      Severity: Minor
      Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 2 hrs to fix

        Function _test_valid_weights has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

          def _test_valid_weights(self, weights):
            for reduction in losses.Reduction.all():
              with ops.Graph().as_default() as g:
                self.assertEqual(0, len(util.get_losses()))
                weighted_loss = losses.compute_weighted_loss(
        Severity: Minor
        Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 1 hr to fix

        Cognitive Complexity

        Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

        A method's cognitive complexity is based on a few simple rules:

        • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
        • Code is considered more complex for each "break in the linear flow of the code"
        • Code is considered more complex when "flow breaking structures are nested"

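As a hypothetical illustration of these rules (not code from the file under review), nested flow-breaking structures accumulate a score while an equivalent flat expression does not:

```python
# Hypothetical illustration: each flow break adds complexity, nesting adds
# more, and language shorthand (a generator expression) adds none.
def count_small(items, limit):
    total = 0
    for item in items:            # +1: break in linear flow
        if item is not None:      # +2: nested flow break
            if item < limit:      # +3: deeper nesting
                total += 1
    return total

def count_small_flat(items, limit):
    # Same behavior, collapsed into shorthand the language provides.
    return sum(1 for item in items if item is not None and item < limit)
```

Both functions return the same counts; only the first accumulates a cognitive-complexity score from its nesting.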

        Function testUnweightedFromPlaceholder has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

          def testUnweightedFromPlaceholder(self):
            for reduction in losses.Reduction.all():
              with ops.Graph().as_default() as g:
                self.assertEqual(0, len(util.get_losses()))
                raw_losses = array_ops.placeholder(dtype=dtypes.float32)
        Severity: Minor
        Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 45 mins to fix

        Function testUnweighted has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

          def testUnweighted(self):
            for reduction in losses.Reduction.all():
              with ops.Graph().as_default() as g:
                self.assertEqual(0, len(util.get_losses()))
                raw_losses = self._raw_losses
        Severity: Minor
        Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 45 mins to fix

        Avoid deeply nested control flow statements.

                    if reduction == losses.Reduction.NONE:
                      self.assertAllClose(self._raw_losses,
                                          self.evaluate(unweighted_loss))
                    elif reduction == losses.Reduction.SUM:
                      self.assertAllClose(
        Severity: Major
        Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 45 mins to fix
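One way such nesting could be reduced is a dispatch table mapping each reduction mode to the function computing its expected value. This is a sketch under the assumption that each branch only computes the expected loss for its mode; the string keys are illustrative stand-ins for the `losses.Reduction` constants:

```python
import numpy as np

# Illustrative dispatch table: map each reduction mode to the function that
# computes its expected loss, replacing the nested if/elif ladder.
EXPECTED = {
    "none": lambda raw: raw,
    "sum": np.sum,
    "mean": np.mean,
}

def expected_loss(reduction, raw_losses):
    """Look up the expected reduced value for one reduction mode."""
    return EXPECTED[reduction](np.asarray(raw_losses))
```

The inner branches then collapse to a single `self.assertAllClose(expected_loss(reduction, raw_losses), actual)` per loop iteration.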

          Avoid deeply nested control flow statements.

                      if reduction == losses.Reduction.MEAN:
                        self.assertAllClose(weighted_sum / np.sum(broadcast_weights),
                                            self.evaluate(weighted_loss))
                      elif (reduction == losses.Reduction.SUM_OVER_NONZERO_WEIGHTS or
                            reduction == losses.Reduction.SUM_BY_NONZERO_WEIGHTS):
          Severity: Major
          Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 45 mins to fix

            Avoid deeply nested control flow statements.

                        if reduction == losses.Reduction.NONE:
                          self.assertAllClose(
                              self._raw_losses, unweighted_loss.eval(feed_dict))
                        elif reduction == losses.Reduction.SUM:
                          self.assertAllClose(
            Severity: Major
            Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 45 mins to fix

              Function setUp has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

                def setUp(self):
                  super(ComputeWeightedLossTest, self).setUp()
                  self._shape = (3, 2, 4)
                  raw_losses = np.zeros(self._shape)
                  next_loss = 0.0
              Severity: Minor
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 25 mins to fix
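The truncated snippet suggests the elided loops fill `raw_losses` with consecutive values; if so (an assumption, since the loop body is cut off above), the nested loops could be replaced by one vectorized expression with no flow breaks at all:

```python
import numpy as np

# Assumption: the elided setUp loops assign 0.0, 1.0, 2.0, ... in C order.
# If so, the whole nested-loop body collapses to a single line.
shape = (3, 2, 4)
raw_losses = np.arange(np.prod(shape), dtype=np.float64).reshape(shape)
```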

              Function setUp has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

                def setUp(self):
                  super(MeanPairwiseSquaredErrorTest, self).setUp()
                  self._predictions = np.array([[4, 8, 12], [8, 1, 3]])
                  self._labels = np.array([[1, 9, 2], [-5, -5, 7]])
              
              
              Severity: Minor
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py - About 25 mins to fix

              Similar blocks of code found in 2 locations. Consider refactoring.

                @test_util.run_deprecated_v1
                def testLossWithSingleDimPlaceholderForLogitsAndWeights1(self):
                  logits = array_ops.placeholder(dtypes.float32, shape=(None, 1))
                  labels = array_ops.placeholder(dtypes.float32, shape=(None, 1))
                  weights = array_ops.ones_like(logits, dtype=dtypes.float32)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 day to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 525..540

              Duplicated Code

              Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

              Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

              When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
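For the pair of placeholder tests flagged above, which differ only in the trailing placeholder dimension, one DRY-ing approach can be sketched as follows (the helper name is hypothetical, not part of the test file):

```python
# Hypothetical helper: the two near-duplicate tests differ only in the last
# placeholder dimension, so one parameterized body can serve both.
def placeholder_shapes(last_dim):
    """Return the matching logits/labels shapes used by both tests."""
    shape = (None, last_dim)
    return {"logits": shape, "labels": shape}

# One loop over two shapes replaces the two duplicated test methods.
cases = [placeholder_shapes(d) for d in (1, 2)]
```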

              Tuning

              This issue has a mass of 154.

              We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

              The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

              If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

              See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
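A sketch of such tuning (the threshold value is illustrative, not a recommendation; check codeclimate-duplication's documentation for the current schema):

```yaml
# .codeclimate.yml — raise the Python duplication mass threshold so that
# smaller similar blocks are no longer reported.
version: "2"
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 150
```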

              Similar blocks of code found in 2 locations. Consider refactoring.

                @test_util.run_deprecated_v1
                def testLossWithSingleDimPlaceholderForLogitsAndWeights2(self):
                  logits = array_ops.placeholder(dtypes.float32, shape=(None, 2))
                  labels = array_ops.placeholder(dtypes.float32, shape=(None, 2))
                  weights = array_ops.ones_like(logits, dtype=dtypes.float32)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 day to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 508..523

This issue has a mass of 154.

              Similar blocks of code found in 3 locations. Consider refactoring.

                def testNonZeroLossWithOneDimBatchSpecificWeightsSomeZero(self):
                  weights = constant_op.constant((1.2, 0), shape=(2, 1))
                  expected_losses = np.multiply(self._expected_losses,
                                                np.asarray([1.2, 1.2, 1.2, 0, 0, 0]).reshape(
                                                    (2, 3)))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 731..739
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 751..759
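The three weight-variant tests in this group differ only in their weight tuples and the per-element factors they imply; a sketch of folding them into one data-driven loop (values mirror the snippets, the helper name is hypothetical):

```python
import numpy as np

# Each case pairs a weights tuple with the per-element factors it implies.
CASES = [
    ((1.2, 3.4), [1.2, 1.2, 1.2, 3.4, 3.4, 3.4]),
    ((1.2, 0.0), [1.2, 1.2, 1.2, 0.0, 0.0, 0.0]),
]

def expected_losses(base_losses, factors):
    """Scale the per-element base losses by the broadcast weight factors."""
    return np.multiply(base_losses, np.asarray(factors).reshape((2, 3)))
```

Inside the test class, a `for weights, factors in CASES:` loop with `self.subTest(weights=weights)` would then replace the duplicated methods.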

This issue has a mass of 121.

              Similar blocks of code found in 3 locations. Consider refactoring.

                def testNonZeroLossWithOneDimBatchSpecificWeights(self):
                  weights = constant_op.constant((1.2, 3.4), shape=(2, 1))
                  expected_losses = np.multiply(
                      self._expected_losses,
                      np.asarray([1.2, 1.2, 1.2, 3.4, 3.4, 3.4]).reshape((2, 3)))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 741..749
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 751..759

This issue has a mass of 121.

              Similar blocks of code found in 3 locations. Consider refactoring.

                def testNonZeroLossWithTwoDimBatchSpecificWeightsSomeZero(self):
                  weights = constant_op.constant([1.2, 0], shape=[2, 1])
                  expected_losses = np.multiply(self._expected_losses,
                                                np.asarray([1.2, 1.2, 1.2, 0, 0, 0]).reshape(
                                                    (2, 3)))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 731..739
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 741..749

This issue has a mass of 121.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testAllWrongAllWeightsMissing(self):
                  logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                 [0.0, 0.0, 10.0]])
                  labels = constant_op.constant([[0, 0, 1], [1, 0, 0], [0, 1, 0]])
                  weights = constant_op.constant([0, 0, 0], shape=[3])
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 184..191

This issue has a mass of 119.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testSomeWeightsMissing(self):
                  logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                 [0.0, 0.0, 10.0]])
                  labels = constant_op.constant([[0, 0, 1], [1, 0, 0], [0, 1, 0]])
                  weights = constant_op.constant([1.2, 0, 0], shape=[3])
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 175..182

This issue has a mass of 119.

              Similar blocks of code found in 2 locations. Consider refactoring.

                @test_util.run_deprecated_v1
                def testAllWrongInt32Labels(self):
                  logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                 [0.0, 0.0, 10.0]])
                  labels = constant_op.constant([[2], [0], [1]], dtype=dtypes.int32)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 285..294

This issue has a mass of 115.

              Similar blocks of code found in 2 locations. Consider refactoring.

                @test_util.run_deprecated_v1
                def testAllCorrectInt64Labels(self):
                  with self.cached_session():
                    logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                   [0.0, 0.0, 10.0]])
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 237..245

This issue has a mass of 115.

              Similar blocks of code found in 2 locations. Consider refactoring.

                @test_util.run_deprecated_v1
                def testAllCorrectInt32Labels(self):
                  with self.cached_session():
                    logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                   [0.0, 0.0, 10.0]])
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 254..262

This issue has a mass of 115.

              Similar blocks of code found in 2 locations. Consider refactoring.

                @test_util.run_deprecated_v1
                def testAllWrongInt64Labels(self):
                  logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                 [0.0, 0.0, 10.0]])
                  labels = constant_op.constant([[2], [0], [1]], dtype=dtypes.int64)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 274..283

              Duplicated Code

              Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

              Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

              When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

              Tuning

              This issue has a mass of 115.

              We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

              The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

              If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

              See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

              Refactorings

              Further Reading
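The tuning advice above maps to a small edit in `.codeclimate.yml`. A sketch, assuming the duplication plugin's current configuration schema (key names per codeclimate-duplication's documentation; the threshold value is illustrative, chosen so only blocks around the 69-115 mass range flagged in this report would be analyzed):

```yaml
# .codeclimate.yml -- raise the Python mass threshold so smaller
# similar-code blocks are no longer reported as duplication
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 60
```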

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testSomeWeightsMissing(self):
                  logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                 [0.0, 0.0, 10.0]])
                  labels = constant_op.constant([[2], [0], [1]])
                  weights = constant_op.constant([1.2, 0, 0], shape=(3, 1))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 404..411

              This issue has a mass of 114.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testAllWrongAllWeightsMissing(self):
                  logits = constant_op.constant([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0],
                                                 [0.0, 0.0, 10.0]])
                  labels = constant_op.constant([[2], [0], [1]])
                  weights = constant_op.constant([0, 0, 0], shape=(3, 1))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 413..420

              This issue has a mass of 114.

              Similar blocks of code found in 2 locations. Consider refactoring.

                      with self.session(g):
                        for unweighted_loss in unweighted_losses:
                          if reduction == losses.Reduction.NONE:
                            self.assertAllClose(self._raw_losses,
                                                self.evaluate(unweighted_loss))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1425..1437

              This issue has a mass of 112.

              Similar blocks of code found in 2 locations. Consider refactoring.

                      with self.session(g):
                        for unweighted_loss in unweighted_losses:
                          if reduction == losses.Reduction.NONE:
                            self.assertAllClose(
                                self._raw_losses, unweighted_loss.eval(feed_dict))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 7 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1397..1409

              This issue has a mass of 112.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testZeroLossWhenAllSampleSpecificWeightsAreZero(self):
                  loss = losses.cosine_distance(
                      predictions=constant_op.constant(self._predictions),
                      labels=constant_op.constant(self._labels),
                      dim=2,
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 4 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1333..1340

              This issue has a mass of 78.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testZeroLossWhenAllMeasurementSpecificWeightsAreZero(self):
                  loss = losses.cosine_distance(
                      predictions=constant_op.constant(self._predictions),
                      labels=constant_op.constant(self._labels),
                      dim=2,
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 4 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1324..1331

              This issue has a mass of 78.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testNonZeroLossWithOneDimBatchSpecificWeights(self):
                  weights = constant_op.constant([1.2, 3.4], shape=(2, 1))
                  loss = losses.mean_squared_error(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(767.8 / 6.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 4 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 978..982

              This issue has a mass of 75.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testNonZeroLossWithTwoDimBatchSpecificWeights(self):
                  weights = constant_op.constant([1.2, 3.4], shape=[2, 1])
                  loss = losses.mean_squared_error(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(767.8 / 6.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 4 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 972..976

              This issue has a mass of 75.

              Similar blocks of code found in 3 locations. Consider refactoring.

                def testNonZeroLossWithSampleSpecificWeightsMostZero(self):
                  weights = constant_op.constant([0, 0, 0, 0, 0, 2], shape=[2, 3])
                  loss = losses.absolute_difference(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(6.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 86..90
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 990..994
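Near-identical weighted-loss tests like the ones flagged in this family can be collapsed into one table-driven test. A minimal sketch using stdlib `unittest.subTest`, with a plain-Python stand-in for the TF `SUM_BY_NONZERO_WEIGHTS` reduction (the helper `weighted_mean_by_nonzero` is hypothetical; in the real file you would keep calling `losses.absolute_difference` / `losses.mean_squared_error` and vary only the table rows):

```python
import unittest


def weighted_mean_by_nonzero(per_element_losses, weights):
  """Stand-in for the SUM_BY_NONZERO_WEIGHTS reduction in tf.losses:
  weighted sum of losses divided by the count of nonzero weights."""
  total = sum(w * l for l, w in zip(per_element_losses, weights))
  nonzero = sum(1 for w in weights if w != 0)
  return total / nonzero if nonzero else 0.0


class WeightedLossTest(unittest.TestCase):
  # Same fixture values the duplicated tests use, flattened from shape (2, 3).
  _predictions = [4, 8, 12, 8, 1, 3]
  _labels = [1, 9, 2, -5, -2, 6]

  # One row per former near-identical test method:
  # (name, per-element loss fn, weights, expected loss)
  CASES = [
      ("abs_diff_sample_weights", abs, [3, 6, 5, 0, 4, 2], 16.6),
      ("mse_most_weights_zero", lambda d: d * d, [0, 0, 0, 0, 0, 2], 18.0),
  ]

  def test_weighted_losses(self):
    for name, elem_loss, weights, expected in self.CASES:
      with self.subTest(name):
        per_elem = [elem_loss(l - p)
                    for l, p in zip(self._labels, self._predictions)]
        self.assertAlmostEqual(
            expected, weighted_mean_by_nonzero(per_elem, weights), places=3)
```

Adding a new weight configuration then means adding one row, not one more copy of the method body; `absl.testing.parameterized` would achieve the same with separate test names per row.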

              This issue has a mass of 73.

              Similar blocks of code found in 3 locations. Consider refactoring.

                def testNonZeroLossWithSampleSpecificWeightsMostZero(self):
                  weights = constant_op.constant([0, 0, 0, 0, 0, 2], shape=[2, 3])
                  loss = losses.mean_squared_error(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(18.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 86..90
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 92..96

              This issue has a mass of 73.

              Similar blocks of code found in 3 locations. Consider refactoring.

                def testNonZeroLossWithSampleSpecificWeights(self):
                  weights = constant_op.constant([3, 6, 5, 0, 4, 2], shape=[2, 3])
                  loss = losses.absolute_difference(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(16.6, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 92..96
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 990..994

              This issue has a mass of 73.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testIncompatibleShapes(self):
                  with self.cached_session():
                    predictions = constant_op.constant([[-1.0], [2.1]])
                    labels = constant_op.constant([0.0, 1.0])
                    with self.assertRaises(ValueError):
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 828..833

              This issue has a mass of 72.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testIncompatibleShapes(self):
                  with self.cached_session():
                    logits = constant_op.constant([[-1.0], [2.1]])
                    labels = constant_op.constant([0.0, 1.0])
                    with self.assertRaises(ValueError):
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 866..871

              This issue has a mass of 72.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def setUp(self):
                  super(MeanSquaredErrorTest, self).setUp()
                  self._predictions = constant_op.constant([4, 8, 12, 8, 1, 3], shape=(2, 3))
                  self._labels = constant_op.constant([1, 9, 2, -5, -2, 6], shape=(2, 3))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 40..43

              This issue has a mass of 72.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def setUp(self):
                  super(AbsoluteDifferenceLossTest, self).setUp()
                  self._predictions = constant_op.constant([4, 8, 12, 8, 1, 3], shape=(2, 3))
                  self._labels = constant_op.constant([1, 9, 2, -5, -2, 6], shape=(2, 3))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 926..929
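The two duplicated `setUp` bodies flagged here are a textbook case for a shared fixture base class. A sketch with stdlib `unittest`, using plain nested lists as a stand-in for `constant_op.constant(..., shape=(2, 3))` (the base-class name is hypothetical; in the real file it would build the actual TF constants):

```python
import unittest


class WeightedLossTestBase(unittest.TestCase):
  """Shared fixture: the setUp duplicated across AbsoluteDifferenceLossTest
  and MeanSquaredErrorTest moves here exactly once."""

  def setUp(self):
    super().setUp()
    self._predictions = [[4, 8, 12], [8, 1, 3]]  # shape (2, 3)
    self._labels = [[1, 9, 2], [-5, -2, 6]]      # shape (2, 3)


class AbsoluteDifferenceLossTest(WeightedLossTestBase):

  def test_fixture_shape(self):
    self.assertEqual(2, len(self._labels))
    self.assertEqual(3, len(self._labels[0]))


class MeanSquaredErrorTest(WeightedLossTestBase):

  def test_fixture_shape(self):
    self.assertEqual(2, len(self._predictions))
    self.assertEqual(3, len(self._predictions[0]))
```

If the fixture ever changes (e.g. new shapes or dtypes), it now changes in one place instead of diverging silently between the two test classes.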

              This issue has a mass of 72.

              Similar blocks of code found in 2 locations. Consider refactoring.

                def testNonZeroLossWithOneDimBatchSpecificWeights(self):
                  weights = constant_op.constant((1.2, 0.0), shape=(2, 1))
                  loss = losses.absolute_difference(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(5.6, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 80..84

              This issue has a mass of 69.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testNonZeroLossWithTwoDimBatchSpecificWeights(self):
                  weights = constant_op.constant([1.2, 0.0], shape=[2, 1])
                  loss = losses.absolute_difference(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(5.6, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 74..78

              This issue has a mass of 69.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session() as sess:
                    loss = sess.run(loss, feed_dict={tf_predictions: self._np_predictions})
                    self.assertAlmostEqual(weights * -np.sum(self._expected_losses) / 6.0,
                                           loss, 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 716..719

              This issue has a mass of 66.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session() as sess:
                    loss = sess.run(loss, feed_dict={tf_predictions: self._np_predictions})
                    self.assertAlmostEqual(weights * -np.sum(self._expected_losses) / 6.0,
                                           loss, 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 726..729

              This issue has a mass of 66.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testAllCorrectNoWeights(self):
                  loss = losses.cosine_distance(
                      predictions=constant_op.constant(self._labels),
                      labels=constant_op.constant(self._labels),
                      dim=2)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1266..1272

              This issue has a mass of 66.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testPartiallyCorrectWithIntegerValues(self):
                  loss = losses.cosine_distance(
                      predictions=constant_op.constant(self._predictions),
                      labels=constant_op.constant(self._labels),
                      dim=2)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1258..1264

              This issue has a mass of 66.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.softmax_cross_entropy(labels, logits, weights)
                    self.assertAlmostEqual((1.2 + 3.4 + 5.6) * 10.0 / 3.0,
                                           self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 389..392
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 399..402

              This issue has a mass of 62.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.sparse_softmax_cross_entropy(labels, logits, weights)
                    self.assertAlmostEqual((1.2 + 3.4 + 5.6) * 10.0 / 3.0,
                                           self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 170..173
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 389..392

              This issue has a mass of 62.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.sparse_softmax_cross_entropy(labels, logits, weights)
                    self.assertAlmostEqual((1.2 + 3.4 + 5.6) * 10.0 / 3.0,
                                           self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 170..173
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 399..402
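              These three near-duplicates differ only in which loss function is called (`losses.softmax_cross_entropy` versus `losses.sparse_softmax_cross_entropy`), so one refactoring is to parameterize over the function under test and assert the shared weighted-mean contract once. A hedged sketch with plain-Python stand-ins for the two losses:

```python
import unittest

# Toy stand-ins for the two loss functions; each obeys the same
# weighted-mean contract the duplicated assertions exercise.
def loss_a(values, weights):
    return sum(w * v for v, w in zip(values, weights)) / len(values)

def loss_b(values, weights):
    return sum(w * v for v, w in zip(values, weights)) / len(values)

class WeightedLossContractTest(unittest.TestCase):
    def test_weighted_mean_contract(self):
        values = [10.0, 10.0, 10.0]
        weights = [1.2, 3.4, 5.6]
        expected = (1.2 + 3.4 + 5.6) * 10.0 / 3.0  # = 34.0
        for loss_fn in (loss_a, loss_b):
            with self.subTest(loss_fn=loss_fn.__name__):
                self.assertAlmostEqual(expected, loss_fn(values, weights),
                                       places=3)
```

With the real APIs, `subTest` (or `absl.testing.parameterized`, which TensorFlow tests commonly use) keeps per-function failures distinguishable while collapsing three bodies into one.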

              This issue has a mass of 62.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testLossWithSampleSpecificWeightsAllZero(self):
                  weights = array_ops.zeros((2, 3))
                  loss = losses.absolute_difference(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(0.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 996..1000

              This issue has a mass of 62.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testLossWithSampleSpecificWeightsAllZero(self):
                  weights = array_ops.zeros((2, 3))
                  loss = losses.mean_squared_error(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(0.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 3 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 98..102

              This issue has a mass of 62.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testNonZeroLossWithScalarTensorWeight(self):
                  weights = 2.3
                  loss = losses.absolute_difference(self._labels, self._predictions,
                                                    constant_op.constant(weights))
                  with self.cached_session():
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 964..970

              This issue has a mass of 60.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                @test_util.run_deprecated_v1
                def testNonZeroLossWithScalarTensorWeight(self):
                  weights = 2.3
                  loss = losses.mean_squared_error(self._labels, self._predictions,
                                                   constant_op.constant(weights))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 67..72

              This issue has a mass of 60.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                def testNonZeroLossWithPythonScalarWeight(self):
                  weights = 2.3
                  loss = losses.absolute_difference(self._labels, self._predictions, weights)
                  with self.cached_session():
                    self.assertAlmostEqual(5.5 * weights, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 957..962

              This issue has a mass of 54.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                @test_util.run_deprecated_v1
                def testNonZeroLossWithPythonScalarWeight(self):
                  weights = 2.3
                  loss = losses.mean_squared_error(self._labels, self._predictions, weights)
                  with self.cached_session():
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 61..65

              This issue has a mass of 54.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  self._labels = np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0],
                                             [0, 0, 1], [0, 1, 0]]).reshape((3, 2, 3))
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1237..1244

              This issue has a mass of 54.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  self._predictions = np.asarray([
                      [1, 0, 0],  # Batch 1
                      [0, 0, -1],
                      [1, 0, 0],  # Batch 2
                      [1, 0, 0],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1246..1247

              This issue has a mass of 54.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                @test_util.run_deprecated_v1
                def testNonZeroLoss(self):
                  loss = losses.mean_squared_error(self._labels, self._predictions)
                  with self.cached_session():
                    self.assertAlmostEqual(49.5, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 945..949

              This issue has a mass of 51.
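Pairs like `testNonZeroLoss` and `testAllCorrectNoLossWeight` above differ only in their inputs and expected value, so one refactoring is to hoist the repeated evaluate-and-assert pattern into a helper. A minimal sketch, assuming a standalone `mean_squared_error` stand-in and illustrative fixture values (the real tests call `losses.mean_squared_error` inside a TensorFlow session):

```python
import unittest

import numpy as np


def mean_squared_error(labels, predictions):
    # Stand-in for losses.mean_squared_error: scalar mean of squared diffs.
    labels = np.asarray(labels, dtype=np.float64)
    predictions = np.asarray(predictions, dtype=np.float64)
    return float(np.mean((labels - predictions) ** 2))


class MeanSquaredErrorTest(unittest.TestCase):
    """Duplicated test bodies collapsed into one shared assertion helper."""

    def setUp(self):
        self._predictions = np.array([4.0, 8.0, 12.0, 8.0, 1.0, 3.0])
        self._labels = np.array([1.0, 9.0, 2.0, -5.0, -2.0, 6.0])

    def _assert_loss(self, labels, predictions, expected):
        # The evaluate-and-assertAlmostEqual pattern that each test
        # previously copy-pasted.
        self.assertAlmostEqual(expected, mean_squared_error(labels, predictions), 3)

    def test_all_correct_no_loss(self):
        self._assert_loss(self._predictions, self._predictions, 0.0)

    def test_non_zero_loss(self):
        expected = float(np.mean((self._labels - self._predictions) ** 2))
        self._assert_loss(self._labels, self._predictions, expected)
```

Each test body collapses to a one-line call, so a future change to the assertion pattern happens in exactly one place.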

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                @test_util.run_deprecated_v1
                def testAllCorrectNoLossWeight(self):
                  loss = losses.mean_squared_error(self._predictions, self._predictions)
                  with self.cached_session():
                    self.assertAlmostEqual(0.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 951..955

              This issue has a mass of 51.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.softmax_cross_entropy(labels, logits,
                                                        constant_op.constant(weights))
                    self.assertAlmostEqual(weights * 10.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 323..326

              This issue has a mass of 50.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.sparse_softmax_cross_entropy(labels, logits,
                                                               constant_op.constant(weights))
                    self.assertAlmostEqual(weights * 10.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 2 hrs to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 160..163

              This issue has a mass of 50.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    self.assertAlmostEqual(weights * -np.sum(self._expected_losses) / 6.0,
                                           self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 698..700

              This issue has a mass of 49.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    self.assertAlmostEqual(weights * -np.sum(self._expected_losses) / 6.0,
                                           self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 706..708

              This issue has a mass of 49.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                def testAllCorrectNoLossWeight(self):
                  loss = losses.absolute_difference(self._predictions, self._predictions)
                  with self.cached_session():
                    self.assertAlmostEqual(0.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 56..59
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 676..679

              This issue has a mass of 47.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                def testNonZeroLoss(self):
                  loss = losses.absolute_difference(self._labels, self._predictions)
                  with self.cached_session():
                    self.assertAlmostEqual(5.5, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 51..54
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 676..679

              This issue has a mass of 47.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                def testAllCorrectNoLossWeight(self):
                  loss = losses.log_loss(self._labels, self._labels)
                  with self.cached_session():
                    self.assertAlmostEqual(0.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 51..54
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 56..59

              This issue has a mass of 47.

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  predictions = np.array([
                      [[4, 8, 12], [1, 2, 3], [4, 5, 6]],
                      [[8, 1, 3], [7, 8, 9], [10, 11, 12]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

              This issue has a mass of 46.
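The nine-location group above re-declares the same `predictions` and `labels` array literals in each test. One refactoring is to define the shared fixtures once at module level. A sketch reproducing only the two batches visible in the report excerpt (the full literals in losses_test.py are truncated here and may contain more rows):

```python
import numpy as np

# Shared fixtures declared once instead of re-typed in nine tests.
PREDICTIONS = np.array([
    [[4, 8, 12], [1, 2, 3], [4, 5, 6]],
    [[8, 1, 3], [7, 8, 9], [10, 11, 12]],
])
LABELS = np.array([
    [[1, 9, 2], [12, 11, 10], [9, 8, 7]],
    [[-5, -5, 7], [6, 5, 4], [3, 2, 1]],
])


def make_fixtures():
    # Hand out fresh copies so a test that mutates its inputs cannot
    # leak state into the other tests sharing the data.
    return PREDICTIONS.copy(), LABELS.copy()
```

Tests then call `make_fixtures()` (or read the constants directly if they never mutate them), and any change to the data is made in one place.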

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  predictions = np.array([
                      [[4, 8, 12], [1, 2, 3], [4, 5, 6]],
                      [[8, 1, 3], [7, 8, 9], [10, 11, 12]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

              This issue has a mass of 46.

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  predictions = np.array([
                      [[4, 8, 12], [1, 2, 3], [4, 5, 6]],
                      [[8, 1, 3], [7, 8, 9], [10, 11, 12]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

              This issue has a mass of 46.

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  labels = np.array([
                      [[1, 9, 2], [12, 11, 10], [9, 8, 7]],
                      [[-5, -5, 7], [6, 5, 4], [3, 2, 1]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

              This issue has a mass of 46.

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  labels = np.array([
                      [[1, 9, 2], [12, 11, 10], [9, 8, 7]],
                      [[-5, -5, 7], [6, 5, 4], [3, 2, 1]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

              Duplicated Code

              Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

              Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

              When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

              Tuning

              This issue has a mass of 46.

              We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

              The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

              If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

              See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

              Refactorings

              Further Reading
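The mass-threshold tuning described above lives in the repository's `.codeclimate.yml`. A minimal sketch, assuming the version-2 config schema; the threshold values here are illustrative, not recommendations:

```yaml
# .codeclimate.yml — illustrative duplication-threshold override.
# Raising `threshold` makes the duplication engine less sensitive;
# lowering it flags smaller duplicated fragments.
version: "2"
checks:
  similar-code:
    config:
      threshold: 50   # minimum mass for similar-code findings
  identical-code:
    config:
      threshold: 45   # minimum mass for identical-code findings
```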

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  labels = np.array([
                      [[1, 9, 2], [12, 11, 10], [9, 8, 7]],
                      [[-5, -5, 7], [6, 5, 4], [3, 2, 1]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

This issue has a mass of 46.

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  predictions = np.array([
                      [[4, 8, 12], [1, 2, 3], [4, 5, 6]],
                      [[8, 1, 3], [7, 8, 9], [10, 11, 12]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1164..1166
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

This issue has a mass of 46.

              Similar blocks of code found in 9 locations. Consider refactoring.
              Open

                  labels = np.array([
                      [[1, 9, 2], [12, 11, 10], [9, 8, 7]],
                      [[-5, -5, 7], [6, 5, 4], [3, 2, 1]],
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 8 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1115..1117
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1119..1121
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1126..1128
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1130..1132
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1168..1170
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1176..1178
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1180..1182
              tensorflow/python/kernel_tests/signal/shape_ops_test.py on lines 193..195

This issue has a mass of 46.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.softmax_cross_entropy(labels, logits, weights)
                    self.assertAlmostEqual(weights * 10.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 313..315

This issue has a mass of 44.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    loss = losses.sparse_softmax_cross_entropy(labels, logits, weights)
                    self.assertAlmostEqual(weights * 10.0, self.evaluate(loss), 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 150..152

This issue has a mass of 44.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                def testValueErrorThrownWhenWeightIsNone(self):
                  with self.cached_session():
                    with self.assertRaises(ValueError):
                      losses.absolute_difference(
                          self._predictions, self._predictions, weights=None)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 671..674
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 931..935

This issue has a mass of 42.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                def testValueErrorThrownWhenWeightIsNone(self):
                  with self.cached_session():
                    with self.assertRaises(ValueError):
                      losses.log_loss(self._labels, self._labels, weights=None)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 45..49
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 931..935

This issue has a mass of 42.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                def testValueErrorThrownWhenWeightIsNone(self):
                  with self.cached_session():
                    with self.assertRaises(ValueError):
                      losses.mean_squared_error(
                          self._predictions, self._predictions, weights=None)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 45..49
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 671..674

This issue has a mass of 42.
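One way to collapse the three near-identical `testValueErrorThrownWhenWeightIsNone` variants flagged above is a single data-driven check over the loss functions that share the "weights=None raises ValueError" contract. A minimal self-contained sketch — `absolute_difference` here is a toy stand-in, not the TensorFlow op:

```python
# Sketch: fold the three duplicated tests into one data-driven check.
# `absolute_difference` is a toy stand-in for the real losses functions.
def absolute_difference(predictions, labels, weights=1.0):
    if weights is None:
        raise ValueError("weights must not be None")
    return sum(abs(p - l) for p, l in zip(predictions, labels)) * weights

def assert_raises_value_error_when_weights_none(loss_fn, *args):
    """Shared helper replacing the three copy-pasted test bodies."""
    try:
        loss_fn(*args, weights=None)
    except ValueError:
        return
    raise AssertionError("%s accepted weights=None" % loss_fn.__name__)

preds = [1.0, 2.0, 3.0]
# In the real test class this would loop over absolute_difference,
# log_loss, and mean_squared_error.
assert_raises_value_error_when_weights_none(absolute_difference, preds, preds)
```

In the actual test suite, `parameterized.parameters` from absl would express the same loop more idiomatically.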

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                        y = self._labels[b, i].item() - self._labels[b, j].item()
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1018..1018

This issue has a mass of 41.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                        x = self._predictions[b, i].item() - self._predictions[b, j].item()
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 1 hr to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1019..1019

This issue has a mass of 41.
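The duplicated `x`/`y` lines flagged above both compute a within-row element difference; a tiny helper (name illustrative, not from the test file) removes the repetition:

```python
# Illustrative helper for the duplicated pairwise-difference expressions:
# both flagged lines subtract element j from element i within batch row b.
def pairwise_delta(values, b, i, j):
    return values[b][i] - values[b][j]

labels = [[1.0, 9.0, 2.0]]
predictions = [[4.0, 8.0, 12.0]]

y = pairwise_delta(labels, 0, 0, 1)       # 1.0 - 9.0
x = pairwise_delta(predictions, 0, 0, 1)  # 4.0 - 8.0
```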

              Similar blocks of code found in 5 locations. Consider refactoring.
              Open

                    logits = constant_op.constant([[100.0, -100.0, -100.0, -100.0],
                                                   [-100.0, 100.0, -100.0, -100.0],
                                                   [-100.0, -100.0, 100.0, -100.0],
                                                   [-100.0, -100.0, -100.0, 100.0]])
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 4 other locations - About 50 mins to fix
              tensorflow/python/kernel_tests/image_ops/draw_bounding_box_op_test.py on lines 125..126
              tensorflow/python/kernel_tests/image_ops/draw_bounding_box_op_test.py on lines 132..133
              tensorflow/python/kernel_tests/linalg/tridiagonal_solve_op_test.py on lines 605..606
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 480..483

This issue has a mass of 36.

              Similar blocks of code found in 5 locations. Consider refactoring.
              Open

                    logits = constant_op.constant([[100.0, -100.0, -100.0, -100.0],
                                                   [-100.0, 100.0, -100.0, -100.0],
                                                   [-100.0, -100.0, 100.0, -100.0],
                                                   [-100.0, -100.0, -100.0, 100.0]])
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 4 other locations - About 50 mins to fix
              tensorflow/python/kernel_tests/image_ops/draw_bounding_box_op_test.py on lines 125..126
              tensorflow/python/kernel_tests/image_ops/draw_bounding_box_op_test.py on lines 132..133
              tensorflow/python/kernel_tests/linalg/tridiagonal_solve_op_test.py on lines 605..606
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 465..468

This issue has a mass of 36.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

                    with self.cached_session():
                      with self.assertRaisesRegex(errors_impl.OpError, expected_error_msg):
                        weighted_loss.eval(feed_dict={weights_placeholder: weights})
              Severity: Minor
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 50 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1464..1466

This issue has a mass of 36.

              Identical blocks of code found in 2 locations. Consider refactoring.
              Open

                    with self.cached_session():
                      with self.assertRaisesRegex(errors_impl.OpError, expected_error_msg):
                        weighted_loss.eval(feed_dict={weights_placeholder: weights})
              Severity: Minor
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 50 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 1487..1489

This issue has a mass of 36.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                    self.assertAlmostEqual((1.2 + 3.4 + 5.6) * 10.0 / 3.0, loss_val, 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 45 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 365..365
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 382..382

This issue has a mass of 35.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                    self.assertAlmostEqual((1.0 + 1.0 + 1.0) * 10.0 / 3.0, loss_val, 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 45 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 349..349
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 382..382

              Duplicated Code

              This issue has a mass of 35.

              Similar blocks of code found in 3 locations. Consider refactoring.
              Open

                    self.assertAlmostEqual((1.2 + 3.4 + 5.6) * 10.0 / 3.0, loss_val, 3)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 2 other locations - About 45 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 349..349
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 365..365

              Duplicated Code

              This issue has a mass of 35.

              Similar blocks of code found in 8 locations. Consider refactoring.
              Open

                  labels = constant_op.constant((
                      (1, 0, 1), (1, 1, 0), (0, 1, 1)
                  ), dtype=dtypes.int64)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 7 other locations - About 35 mins to fix
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 238..239
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 244..244
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 250..250
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 258..258
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 264..264
              tensorflow/python/kernel_tests/array_ops/stack_op_test.py on lines 365..366
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 581..585

              Duplicated Code

              This issue has a mass of 33.

              Similar blocks of code found in 8 locations. Consider refactoring.
              Open

                  logits = constant_op.constant((
                      (100.0, -100.0, 100.0),
                      (100.0, -100.0, 100.0),
                      (100.0, 100.0, -100.0)
                  ), dtype=dtypes.float64)
              Severity: Major
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 7 other locations - About 35 mins to fix
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 238..239
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 244..244
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 250..250
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 258..258
              tensorflow/python/kernel_tests/array_ops/one_hot_op_test.py on lines 264..264
              tensorflow/python/kernel_tests/array_ops/stack_op_test.py on lines 365..366
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 586..588

              Duplicated Code

              This issue has a mass of 33.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    with self.assertRaises(ValueError):
                      losses.softmax_cross_entropy(labels, logits, weights=None)
              Severity: Minor
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 30 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 233..235

              Duplicated Code

              This issue has a mass of 32.

              Similar blocks of code found in 2 locations. Consider refactoring.
              Open

                  with self.cached_session():
                    with self.assertRaises(ValueError):
                      losses.sparse_softmax_cross_entropy(labels, logits, weights=None)
              Severity: Minor
              Found in tensorflow/python/kernel_tests/nn_ops/losses_test.py and 1 other location - About 30 mins to fix
              tensorflow/python/kernel_tests/nn_ops/losses_test.py on lines 119..121

              Duplicated Code

              This issue has a mass of 32.
