KarrLab/pkg_utils

pkg_utils/core.py

Summary

Maintainability: F (estimated time to fix: 3 days)
Test Coverage: A (100%)

File core.py has 385 lines of code (exceeds 250 allowed). Consider refactoring.
Open

""" Utilities for linking setuptools with package version metadata, 
GitHub README.md files, requirements.txt files, and restoring overridden
entry points during for editable installations.

:Author: Jonathan Karr <jonrkarr@gmail.com>
Severity: Minor
Found in pkg_utils/core.py - About 5 hrs to fix

    Function parse_requirement_line has a Cognitive Complexity of 24 (exceeds 5 allowed). Consider refactoring.
    Open

    def parse_requirement_line(line, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
        """ Parse lines from a requirements.txt file into list of requirements and dependency links
    
        Args:
            line (:obj:`str`): line from a requirements.txt file
    Severity: Minor
    Found in pkg_utils/core.py - About 3 hrs to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"

    Further reading
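
To make these rules concrete, here is a hypothetical Python sketch (not code from pkg_utils) of the same logic written two ways. The nested version pays an extra increment for every level of nesting, while the guard-clause version keeps the flow linear; the score comments are approximate:

```python
# Nested version: each condition inside the loop adds a nesting penalty,
# so the cognitive complexity climbs quickly.
def first_long_word_nested(words, min_len):
    result = None
    for word in words:                    # +1 (loop)
        if word:                          # +2 (if, nested once)
            if len(word) >= min_len:      # +3 (if, nested twice)
                if result is None:        # +4 (if, nested three deep)
                    result = word
    return result

# Flattened version: a guard clause with `continue` breaks the flow only
# once and adds no nesting, so the total stays low.
def first_long_word_flat(words, min_len):
    for word in words:                    # +1 (loop)
        if not word or len(word) < min_len:  # +1 (guard clause)
            continue
        return word
    return None
```

Both functions return the same result; only the shape of the control flow differs, which is exactly what Cognitive Complexity measures.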

    Function parse_optional_requirements_file has a Cognitive Complexity of 23 (exceeds 5 allowed). Consider refactoring.
    Open

    def parse_optional_requirements_file(filename, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
        """ Parse a requirements.optional.txt file into list of requirements and dependency links
    
        Args:
            filename (:obj:`str`): path to requirements.txt file
    Severity: Minor
    Found in pkg_utils/core.py - About 3 hrs to fix

    Function expand_package_data_filename_patterns has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
    Open

    def expand_package_data_filename_patterns(dirname, package_data_filename_patterns=None):
        """ Expand the package data filenames
    
        Args:
            package_data_filename_patterns (:obj:`dict`): package data filename patterns
    Severity: Minor
    Found in pkg_utils/core.py - About 1 hr to fix

    Function get_dependencies has 31 lines of code (exceeds 25 allowed). Consider refactoring.
    Open

    def get_dependencies(dirname, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
        """ Parse required and optional dependencies from requirements.txt files
    
        Args:
            dirname (:obj:`str`): path to the package
    Severity: Minor
    Found in pkg_utils/core.py - About 1 hr to fix

      Avoid deeply nested control flow statements.
      Open

                          if not match:
                              raise ValueError(
                                  'Could not parse optional dependency: {}'.format(line))
                          option = match.group(1)
      Severity: Major
      Found in pkg_utils/core.py - About 45 mins to fix

        Avoid deeply nested control flow statements.
        Open

                            if option not in extras_require:
                                extras_require[option] = []
                            extras_require[option] += tmp1
        Severity: Major
        Found in pkg_utils/core.py - About 45 mins to fix

          Avoid deeply nested control flow statements.
          Open

                              if option is None:
                                  raise ValueError(
                                      "Required dependencies should not be placed in an optional dependencies file: {}".format(line))
                              tmp1, tmp2 = parse_requirement_lines([line],
          Severity: Major
          Found in pkg_utils/core.py - About 45 mins to fix

            Function get_dependencies has 5 arguments (exceeds 4 allowed). Consider refactoring.
            Open

            def get_dependencies(dirname, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
            Severity: Minor
            Found in pkg_utils/core.py - About 35 mins to fix

              Function parse_requirement_lines has 5 arguments (exceeds 4 allowed). Consider refactoring.
              Open

              def parse_requirement_lines(lines, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
              Severity: Minor
              Found in pkg_utils/core.py - About 35 mins to fix

                Function parse_optional_requirements_file has 5 arguments (exceeds 4 allowed). Consider refactoring.
                Open

                def parse_optional_requirements_file(filename, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
                Severity: Minor
                Found in pkg_utils/core.py - About 35 mins to fix

                  Function parse_requirement_line has 5 arguments (exceeds 4 allowed). Consider refactoring.
                  Open

                  def parse_requirement_line(line, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
                  Severity: Minor
                  Found in pkg_utils/core.py - About 35 mins to fix

                    Function parse_requirements_file has 5 arguments (exceeds 4 allowed). Consider refactoring.
                    Open

                    def parse_requirements_file(filename, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
                    Severity: Minor
                    Found in pkg_utils/core.py - About 35 mins to fix
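
Since all five flagged functions share the same four `include_*` flags, one hypothetical refactoring (not part of the current pkg_utils API) is to bundle the flags into a small options object, dropping each signature to two arguments:

```python
from dataclasses import dataclass


@dataclass
class ParseOptions:
    """Shared flags for the requirements-parsing functions,
    with the same defaults as the current keyword arguments."""
    include_uri: bool = False
    include_extras: bool = True
    include_specs: bool = True
    include_markers: bool = True


def parse_requirement_line(line, options=None):
    """Sketch of the slimmed-down signature; the parsing body itself
    would be unchanged, reading options.include_uri and friends."""
    options = options or ParseOptions()
    return line.strip(), options
```

Callers that need non-default behavior pass `ParseOptions(include_uri=True)` once and reuse it across all the parsing calls.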

                      Function get_dependencies has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.
                      Open

                      def get_dependencies(dirname, include_uri=False, include_extras=True, include_specs=True, include_markers=True):
                          """ Parse required and optional dependencies from requirements.txt files
                      
                          Args:
                              dirname (:obj:`str`): path to the package
                      Severity: Minor
                      Found in pkg_utils/core.py - About 35 mins to fix

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              if req.subdirectory:
                                  dependency_link += '&subdirectory=' + req.subdirectory
                                  uri_line = uri_line.replace(
                                      '&subdirectory=' + req.subdirectory, '')
                      Severity: Major
                      Found in pkg_utils/core.py and 1 other location - About 1 hr to fix
                      pkg_utils/core.py on lines 386..388

                      Duplicated Code

                      Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

                      Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

                      When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

                      Tuning

                      This issue has a mass of 41.

                      We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

                      The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

                      If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

                      See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

                      Refactorings

                      Further Reading

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              if req.revision:
                                  dependency_link += '@' + req.revision
                                  uri_line = uri_line.replace('@' + req.revision, '')
                      Severity: Major
                      Found in pkg_utils/core.py and 1 other location - About 1 hr to fix
                      pkg_utils/core.py on lines 398..401

This issue has a mass of 41.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              for name, func in parser.items('console_scripts'):
                                  scripts[str(name)] = {
                                      'function': str(func),
                      Severity: Major
                      Found in pkg_utils/core.py and 1 other location - About 1 hr to fix
                      pkg_utils/core.py on lines 495..497

This issue has a mass of 38.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                          for name, func in parser.items('console_scripts'):
                              console_scripts[str(name)] = {
                                  'function': str(func),
                      Severity: Major
                      Found in pkg_utils/core.py and 1 other location - About 1 hr to fix
                      pkg_utils/core.py on lines 472..474
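
The two flagged `console_scripts` loops differ only in which dictionary they populate, so a hypothetical helper could serve both call sites. The sketch below assumes a `configparser`-style object, as the truncated excerpts suggest, and shows only the `function` key (the real dictionaries may carry more fields):

```python
import configparser


def _read_console_scripts(parser, scripts):
    """Copy console_scripts entries from a parsed config into `scripts`."""
    for name, func in parser.items('console_scripts'):
        scripts[str(name)] = {'function': str(func)}
    return scripts


# Example with an entry_points-style section:
parser = configparser.ConfigParser()
parser.read_string('[console_scripts]\nmytool = pkg.cli:main\n')
scripts = _read_console_scripts(parser, {})
```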

This issue has a mass of 38.

                      Similar blocks of code found in 4 locations. Consider refactoring.
                      Open

                          docs_require, tmp = parse_requirements_file(
                              os.path.join(dirname, 'docs/requirements.txt'),
                      Severity: Major
                      Found in pkg_utils/core.py and 3 other locations - About 40 mins to fix
                      pkg_utils/core.py on lines 172..173
                      pkg_utils/core.py on lines 178..179
                      pkg_utils/core.py on lines 184..185

This issue has a mass of 34.

                      Similar blocks of code found in 4 locations. Consider refactoring.
                      Open

                          extras_require, tmp = parse_optional_requirements_file(
                              os.path.join(dirname, 'requirements.optional.txt'),
                      Severity: Major
                      Found in pkg_utils/core.py and 3 other locations - About 40 mins to fix
                      pkg_utils/core.py on lines 172..173
                      pkg_utils/core.py on lines 184..185
                      pkg_utils/core.py on lines 190..191

This issue has a mass of 34.

                      Similar blocks of code found in 4 locations. Consider refactoring.
                      Open

                          tests_require, tmp = parse_requirements_file(
                              os.path.join(dirname, 'tests/requirements.txt'),
                      Severity: Major
                      Found in pkg_utils/core.py and 3 other locations - About 40 mins to fix
                      pkg_utils/core.py on lines 172..173
                      pkg_utils/core.py on lines 178..179
                      pkg_utils/core.py on lines 190..191

This issue has a mass of 34.

                      Similar blocks of code found in 4 locations. Consider refactoring.
                      Open

                          install_requires, tmp = parse_requirements_file(
                              os.path.join(dirname, 'requirements.txt'),
                      Severity: Major
                      Found in pkg_utils/core.py and 3 other locations - About 40 mins to fix
                      pkg_utils/core.py on lines 178..179
                      pkg_utils/core.py on lines 184..185
                      pkg_utils/core.py on lines 190..191

This issue has a mass of 34.
