gmantelet/python-lyth

src/lyth/compiler/token.py

Summary

Maintainability: C - 7 hrs
Test Coverage: (no data)

Function __add__ has a Cognitive Complexity of 22 (exceeds 5 allowed). Consider refactoring.

    def __add__(self, lexeme: str) -> Token:
        """
        Add a scanned character to an existing token.

        This method validates that the character appended to the existing token
Severity: Minor
Found in src/lyth/compiler/token.py - About 3 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

Further reading
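The second and third rules are the usual levers for a refactoring: every `else` branch and every extra nesting level adds an increment, while early returns keep the flow linear. A minimal illustrative sketch (hypothetical code, not taken from token.py):

```python
# Hypothetical sketch: the same classification written two ways.
# Nested version -- each nested branch adds a complexity increment:
def classify_nested(ch: str) -> str:
    if ch:
        if ch.isdigit():
            return "digit"
        else:
            if ch.isalpha():
                return "alpha"
            else:
                return "symbol"
    return "empty"

# Flat version -- guard clauses and early returns keep the flow
# linear, so the nesting increments disappear and the function
# reads top to bottom:
def classify_flat(ch: str) -> str:
    if not ch:
        return "empty"
    if ch.isdigit():
        return "digit"
    if ch.isalpha():
        return "alpha"
    return "symbol"
```

Both functions compute the same result; only the flat version stays within a small complexity budget.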

Consider simplifying this complex logical expression.

        if symbol is not None and self.symbol in Literal:
            raise LythSyntaxError(self.info, msg=LythError.MISSING_SPACE_BEFORE_OPERATOR)

        elif lexeme.isdigit() and self.symbol in Literal:
            self.lexeme += lexeme
Severity: Critical
Found in src/lyth/compiler/token.py - About 2 hrs to fix
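One common way to simplify such a chained condition is to give its repeated sub-expressions a name. The sketch below is hypothetical: `Literal`, the token fields, and the exception are minimal stand-ins for the real lyth classes (`ValueError` replaces `LythSyntaxError`), only the branch structure mirrors the excerpt above.

```python
from enum import Enum


class Literal(Enum):
    # Stand-in for lyth's Literal lexeme category.
    VALUE = "value"


class Tok:
    # Minimal stand-in for Token, just enough to show the refactor.
    def __init__(self, symbol, lexeme=""):
        self.symbol = symbol
        self.lexeme = lexeme

    def _is_literal(self) -> bool:
        # The repeated "self.symbol in Literal" test gets a name, so
        # each branch reads as a sentence instead of a boolean puzzle.
        return self.symbol in Literal

    def append_digit(self, lexeme: str, symbol) -> bool:
        if symbol is not None and self._is_literal():
            raise ValueError("missing space before operator")
        if lexeme.isdigit() and self._is_literal():
            self.lexeme += lexeme
            return True
        return False
```

Naming the predicate does not change behavior, but it collapses the expression the checker flags into two short, readable conditions.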

Avoid too many return statements within this function.

            return self
Severity: Major
Found in src/lyth/compiler/token.py - About 30 mins to fix

Avoid too many return statements within this function.

            return self
Severity: Major
Found in src/lyth/compiler/token.py - About 30 mins to fix

Avoid too many return statements within this function.

            return self
Severity: Major
Found in src/lyth/compiler/token.py - About 30 mins to fix
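When every branch of a long if/elif chain ends in its own `return`, the chain can often be table-driven so that the function has a single exit. A hypothetical sketch of the idea (none of these names come from token.py):

```python
# Hypothetical sketch: collapsing many per-branch returns into one.
def next_state(state: str, ch: str) -> str:
    # Before: a chain of if/elif blocks, each ending in its own
    # "return ...". After: one dictionary lookup plus one return.
    transitions = {
        ("start", "0"): "number",
        ("number", "1"): "number",
        ("start", "+"): "operator",
    }
    return transitions.get((state, ch), "error")
```

The lookup table keeps the decision logic in data, so adding a case no longer adds another return statement.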

Function __init__ has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def __init__(self, lexeme: str, scan: Scanner, force_literal=False) -> None:
        """
        Instantiate a new Token.

        Instantiates a new Token object if the provided symbol is a _Lexeme. If
Severity: Minor
Found in src/lyth/compiler/token.py - About 25 mins to fix


Refactor this function to reduce its Cognitive Complexity from 24 to the 15 allowed.

    def __add__(self, lexeme: str) -> Token:
Severity: Critical
Found in src/lyth/compiler/token.py by sonar-python

Cognitive Complexity is a measure of how hard the control flow of a function is to understand. Functions with high Cognitive Complexity will be difficult to maintain.
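A typical fix for a method like `__add__` is to dispatch to small per-case handlers, each carrying its own complexity budget. The sketch below is hypothetical and heavily simplified; none of the handler names come from token.py itself.

```python
# Hypothetical sketch of splitting one large __add__ into handlers.
class TokenSketch:
    def __init__(self, lexeme: str = "") -> None:
        self.lexeme = lexeme

    def __add__(self, ch: str) -> "TokenSketch":
        # Dispatch keeps __add__ itself trivially simple; each handler
        # deals with exactly one character class.
        if ch.isdigit():
            return self._add_digit(ch)
        if ch.isspace():
            return self._add_space(ch)
        return self._add_symbol(ch)

    def _add_digit(self, ch: str) -> "TokenSketch":
        self.lexeme += ch
        return self

    def _add_space(self, ch: str) -> "TokenSketch":
        # Whitespace is ignored in this simplified sketch.
        return self

    def _add_symbol(self, ch: str) -> "TokenSketch":
        self.lexeme += ch
        return self
```

Each handler stays near a complexity of 1, and the validation rules the real method enforces could live in the matching handler instead of one long chain.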

