DataMedSci/beprof


Showing 8 of 8 total issues

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        try:
            self.y /= factor
        except TypeError as e:
            logger.warning("Division in place is impossible: %s", e)
            if allow_cast:
Severity: Major
Found in beprof/curve.py and 1 other location - About 2 hrs to fix
beprof/profile.py on lines 158..166

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
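
As an illustration only, the try/except division repeated in curve.py and profile.py could be extracted into one shared helper. The helper name _divide_with_cast and the fallback branch below are assumptions made for this sketch, not beprof's actual API (the excerpt above is cut off before the allow_cast branch):

    import logging

    logger = logging.getLogger(__name__)

    def _divide_with_cast(y, divisor, allow_cast=True):
        # Hypothetical shared helper for the duplicated blocks: try an
        # in-place division first, and fall back to an out-of-place
        # (possibly casting) division only when allow_cast permits it.
        try:
            y /= divisor
        except TypeError as e:
            logger.warning("Division in place is impossible: %s", e)
            if not allow_cast:
                raise
            y = y / divisor  # out of place; may cast e.g. int to float
        return y

Both call sites would then reduce to a single line each: self.y = _divide_with_cast(self.y, factor, allow_cast) in curve.py and self.y = _divide_with_cast(self.y, ave, allow_cast) in profile.py.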

Tuning

This issue has a mass of 58.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
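
For reference, a minimal .codeclimate.yml raising the Python mass threshold could look like the sketch below. The value 40 is arbitrary and the exact schema depends on your configuration version, so treat this as an assumption to verify against codeclimate-duplication's documentation:

    version: "2"
    plugins:
      duplication:
        enabled: true
        config:
          languages:
            python:
              mass_threshold: 40

With a threshold of 40, the mass-36 findings later in this report would no longer be flagged, while the mass-58 pair above would remain.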


Similar blocks of code found in 2 locations. Consider refactoring.
Open

        try:
            self.y /= ave
        except TypeError as e:
            logger.warning("Division in place is impossible: %s", e)
            if allow_cast:
Severity: Major
Found in beprof/profile.py and 1 other location - About 2 hrs to fix
beprof/curve.py on lines 107..115

Duplicated Code: this issue has a mass of 58; see the explanation and tuning notes under the first issue above.

File curve.py has 264 lines of code (exceeds 250 allowed). Consider refactoring.
Open

import numpy as np
import math
import copy
from beprof import functions
import logging
Severity: Minor
Found in beprof/curve.py - About 2 hrs to fix

Function main has 41 lines of code (exceeds 25 allowed). Consider refactoring.
Open

    def main():
        print('\nSubtract method :\n')

        c = Curve([[0.0, 0], [5, 5], [10, 0]])
        print('c: \n', c)
Severity: Minor
Found in beprof/curve.py - About 1 hr to fix

Function git_version has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

    def git_version():
        def _minimal_ext_cmd(cmd):
            # construct minimal environment
            env = {}
            for k in ['SYSTEMROOT', 'PATH', 'HOME']:
Severity: Minor
Found in setup.py - About 1 hr to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules (illustrated in the sketch after this list):

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
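
As a hedged illustration (not code from beprof), the scoring mostly penalizes nesting, so flattening nested branches with guard clauses typically lowers the score without changing behavior:

    def keep_positive_nested(values):
        # Nested flow: the inner if statements score higher because
        # they sit inside the loop (flow-breaking structures nested).
        result = []
        for v in values:
            if v is not None:
                if v > 0:
                    result.append(v)
        return result

    def keep_positive_flat(values):
        # Same behavior with a guard clause: one level less nesting,
        # hence a lower cognitive complexity.
        result = []
        for v in values:
            if v is None or v <= 0:
                continue
            result.append(v)
        return result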


Similar blocks of code found in 2 locations. Consider refactoring.
Open

    obj = curve1.__class__(np.dstack((coord1, coord2))[0], **curve1.__dict__['metadata'])
Severity: Minor
Found in beprof/functions.py and 1 other location - About 50 mins to fix
beprof/curve.py on lines 159..159
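
One way to remove this duplication would be a small factory shared by both call sites; the name rebuild_like below is a hypothetical helper invented for this sketch, not part of beprof:

    import numpy as np

    def rebuild_like(template, x, y):
        # Hypothetical helper: pair up the x and y columns and build a new
        # object of the same class as `template`, reusing its metadata.
        data = np.dstack((x, y))[0]
        return template.__class__(data, **template.__dict__['metadata'])

The functions.py line would then read obj = rebuild_like(curve1, coord1, coord2), and the twin line in curve.py (next issue) obj = rebuild_like(self, domain, y).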

Duplicated Code: this issue has a mass of 36; see the explanation and tuning notes under the first issue above.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    obj = self.__class__(np.dstack((domain, y))[0], **self.__dict__['metadata'])
Severity: Minor
Found in beprof/curve.py and 1 other location - About 50 mins to fix
beprof/functions.py on lines 23..23

Duplicated Code: this issue has a mass of 36; see the explanation and tuning notes under the first issue above.

Function normalize has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def normalize(self, dt, allow_cast=True):
        """
        Normalize to 1 over the [-dt, +dt] area. If allow_cast is set
        to True, division may be done out of place and casting may occur.
        If division in place is not possible and allow_cast is False
Severity: Minor
Found in beprof/profile.py - About 25 mins to fix
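
A possible flattening reuses the _divide_with_cast helper sketched under the first issue; _average_over is likewise a hypothetical name, since the excerpt above cuts off before the averaging step:

    def normalize(self, dt, allow_cast=True):
        # Sketch: compute the average over [-dt, +dt], then delegate the
        # try/except division, removing one level of nesting here.
        ave = self._average_over(-dt, dt)  # hypothetical averaging helper
        self.y = _divide_with_cast(self.y, ave, allow_cast)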

Cognitive Complexity: see the explanation under the git_version issue above.
