LucaCappelletti94/tinycrawler

tinycrawler/process/parser.py

Summary

Maintainability: A (about 45 mins of estimated remediation)
Test Coverage: not available

Cyclomatic complexity is too high in method _url_parser. (8)
Status: Open

    def _url_parser(self, page_url: str, soup: BeautifulSoup) -> Set[str]:
        urls = set()
        for a in soup.findAll("a", href=True):
            url = urljoin(page_url, a.get("href"))
            if (
Severity: Minor
Found in tinycrawler/process/parser.py by radon

Cyclomatic Complexity

Cyclomatic Complexity corresponds to the number of decisions a block of code contains plus 1. This number (also called McCabe number) is equal to the number of linearly independent paths through the code. This number can be used as a guide when testing conditional logic in blocks.

Radon analyzes the AST tree of a Python program to compute Cyclomatic Complexity. Statements have the following effects on Cyclomatic Complexity:

Construct         Effect on CC  Reasoning
if                +1            An if statement is a single decision.
elif              +1            The elif statement adds another decision.
else              +0            The else statement does not cause a new decision; the decision is at the if.
for               +1            There is a decision at the start of the loop.
while             +1            There is a decision at the while statement.
except            +1            Each except branch adds a new conditional path of execution.
finally           +0            The finally block is unconditionally executed.
with              +1            The with statement roughly corresponds to a try/except block (see PEP 343 for details).
assert            +1            The assert statement internally roughly equals a conditional statement.
Comprehension     +1            A list/set/dict comprehension or generator expression is equivalent to a for loop.
Boolean Operator  +1            Every boolean operator (and, or) adds a decision point.

Source: http://radon.readthedocs.org/en/latest/intro.html
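As an illustration of the table above, here is a minimal stdlib-only sketch of the same idea: walk the AST, count decision points, and add 1. Radon's real implementation is more thorough (async variants, ternary expressions, per-function scoping), so treat this as a teaching approximation, not radon's code.

```python
import ast

# Node types that add one decision point each, per the table above
# (a simplified subset; elif chains appear as nested If nodes).
DECISION_NODES = (
    ast.If, ast.For, ast.While, ast.ExceptHandler,
    ast.With, ast.Assert, ast.comprehension,
)

def cyclomatic_complexity(source: str) -> int:
    """Rough McCabe number: number of decisions + 1."""
    decisions = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.BoolOp):
            # 'a and b and c' holds two operators -> two decisions.
            decisions += len(node.values) - 1
        elif isinstance(node, DECISION_NODES):
            decisions += 1
    return decisions + 1

code = """
def keep_small_evens(xs):
    out = []
    for x in xs:              # +1 (for)
        if x > 0 and x < 9:   # +1 (if), +1 (and)
            out.append(x)
    return out
"""
print(cyclomatic_complexity(code))  # 4
```

Counting by the table, the sample function has three decisions (for, if, and), so its CC is 4; _url_parser above accumulates 8 the same way.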

Function __init__ has 13 arguments (exceeds 4 allowed). Consider refactoring.
Status: Invalid

    def __init__(self, process_spawn_event: Event, process_callback_event: Event, pages_number: Value, urls_number: Value, responses: Queue, urls: Urls, robots: Robots, file_parser: Callable[[str, BeautifulSoup, Log], None], url_validator: Callable[[str, Log], bool], statistics: Statistics, logger: Log, follow_robots_txt: bool, parser_library: str):
Severity: Major
Found in tinycrawler/process/parser.py - About 1 hr to fix

    Function _url_parser has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
    Status: Open

        def _url_parser(self, page_url: str, soup: BeautifulSoup) -> Set[str]:
            urls = set()
            for a in soup.findAll("a", href=True):
                url = urljoin(page_url, a.get("href"))
                if (
    Severity: Minor
    Found in tinycrawler/process/parser.py - About 45 mins to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"

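The nesting rule is the one that usually separates the two metrics. The hypothetical pair below has identical cyclomatic complexity (three if decisions each), but the flat version reads more easily because its guard clauses avoid the nesting penalty that cognitive complexity charges per level:

```python
# Both functions make the same three decisions, so their cyclomatic
# complexity is equal; cognitive complexity penalizes only the first,
# because each nested if sits one "flow-breaking" level deeper.

def classify_nested(x):
    if x is not None:          # nesting level 0
        if x >= 0:             # nesting level 1 -> extra penalty
            if x % 2 == 0:     # nesting level 2 -> extra penalty
                return "even"
            return "odd"
        return "negative"
    return "missing"

def classify_flat(x):
    if x is None:              # every guard stays at nesting level 0
        return "missing"
    if x < 0:
        return "negative"
    if x % 2 == 0:
        return "even"
    return "odd"

assert classify_nested(4) == classify_flat(4) == "even"
```

Flattening _url_parser's loop body with early `continue` guards would reduce its cognitive complexity the same way, without changing its cyclomatic complexity.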

    Method "__init__" has 14 parameters, which is greater than the 7 authorized.
    Invalid

        def __init__(self, process_spawn_event: Event, process_callback_event: Event, pages_number: Value, urls_number: Value, responses: Queue, urls: Urls, robots: Robots, file_parser: Callable[[str, BeautifulSoup, Log], None], url_validator: Callable[[str, Log], bool], statistics: Statistics, logger: Log, follow_robots_txt: bool, parser_library: str):
    Severity: Major
    Found in tinycrawler/process/parser.py by sonar-python

    A long parameter list can indicate that a new structure should be created to wrap the numerous parameters or that the function is doing too many things.

    Noncompliant Code Example

    With a maximum number of 4 parameters:

    def do_something(param1, param2, param3, param4, param5):
        ...
    

    Compliant Solution

    def do_something(param1, param2, param3, param4):
        ...
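Applied to the flagged `__init__`, the "new structure" the rule suggests is a parameter object. The sketch below groups two of the quoted arguments into a hypothetical `ParserConfig` dataclass; the class name, defaults, and grouping are illustrative, not from tinycrawler:

```python
from dataclasses import dataclass

# Hypothetical parameter object; field names mirror two of the
# __init__ arguments quoted above, the rest would join the same way.
@dataclass(frozen=True)
class ParserConfig:
    follow_robots_txt: bool = False
    parser_library: str = "html.parser"

class Parser:
    def __init__(self, config: ParserConfig):
        # One cohesive object replaces a long positional tail,
        # and new options no longer change the signature.
        self._follow_robots_txt = config.follow_robots_txt
        self._parser_library = config.parser_library

parser = Parser(ParserConfig(follow_robots_txt=True, parser_library="lxml"))
print(parser._parser_library)  # lxml
```

Grouping only the plain configuration flags this way (events, queues, and callbacks can stay as explicit parameters) is often enough to get under both the radon and sonar thresholds.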
    

    Continuation line with same indent as next logical line
    Status: Open

                    (not self._follow_robots_txt or self._robots.can_fetch(url))):
    Severity: Minor
    Found in tinycrawler/process/parser.py by pep8

    Continuation lines indentation.

    Continuation lines should align wrapped elements either vertically
    using Python's implicit line joining inside parentheses, brackets
    and braces, or using a hanging indent.
    
    When using a hanging indent these considerations should be applied:
    - there should be no arguments on the first line, and
    - further indentation should be used to clearly distinguish itself
      as a continuation line.
    
    Okay: a = (\n)
    E123: a = (\n    )
    
    Okay: a = (\n    42)
    E121: a = (\n   42)
    E122: a = (\n42)
    E123: a = (\n    42\n    )
    E124: a = (24,\n     42\n)
    E125: if (\n    b):\n    pass
    E126: a = (\n        42)
    E127: a = (24,\n      42)
    E128: a = (24,\n    42)
    E129: if (a or\n    b):\n    pass
    E131: a = (\n    42\n 24)
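For the flagged `if (` condition this is the E129 case: pycodestyle wants the wrapped condition visually distinct from the body it guards. A sketch of the compliant shape, using a stand-in condition rather than the full expression from parser.py:

```python
# Noncompliant (E129): the continuation line shares the body's indent.
# if (first_condition or
#     second_condition):
#     do_work()

# Compliant: extra indentation separates condition from body.
def check(first_condition, second_condition):
    if (first_condition
            or second_condition):
        return "fetch"
    return "skip"

print(check(False, True))  # fetch
```

Placing the `or` at the start of the continuation line (rather than trailing the first) also matches PEP 8's current recommendation for breaking around binary operators.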

    Line too long (351 > 79 characters)
    Status: Open

        def __init__(self, process_spawn_event: Event, process_callback_event: Event, pages_number: Value, urls_number: Value, responses: Queue, urls: Urls, robots: Robots, file_parser: Callable[[str, BeautifulSoup, Log], None], url_validator: Callable[[str, Log], bool], statistics: Statistics, logger: Log, follow_robots_txt: bool, parser_library: str):
    Severity: Minor
    Found in tinycrawler/process/parser.py by pep8

    Limit all lines to a maximum of 79 characters.

    There are still many devices around that are limited to 80 character
    lines; plus, limiting windows to 80 characters makes it possible to
    have several windows side-by-side.  The default wrapping on such
    devices looks ugly.  Therefore, please limit all lines to a maximum
    of 79 characters. For flowing long blocks of text (docstrings or
    comments), limiting the length to 72 characters is recommended.
    
    Reports error E501.

    Line too long (368 > 79 characters)
    Status: Open

            self._urls, self._responses, self._process_callback_event, self._pages_number, self._urls_number, self._robots, self._follow_robots_txt, self._file_parser,  self._url_validator, self._logger, self._parser_library = urls, responses, process_callback_event, pages_number, urls_number, robots, follow_robots_txt, file_parser, url_validator, logger, parser_library
    Severity: Minor
    Found in tinycrawler/process/parser.py by pep8

    See the E501 explanation under the previous issue.
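The 368-character line above packs eleven attributes into a single tuple unpack. Writing one assignment per line fixes E501 and makes each attribute independently readable and greppable. A sketch (attribute names come from the quoted line; the signature is simplified and untyped for brevity):

```python
class Parser:
    def __init__(self, urls, responses, process_callback_event,
                 pages_number, urls_number, robots, follow_robots_txt,
                 file_parser, url_validator, logger, parser_library):
        # One attribute per line stays well under 79 characters and
        # survives adding or removing a field without re-counting
        # positions on both sides of a tuple unpack.
        self._urls = urls
        self._responses = responses
        self._process_callback_event = process_callback_event
        self._pages_number = pages_number
        self._urls_number = urls_number
        self._robots = robots
        self._follow_robots_txt = follow_robots_txt
        self._file_parser = file_parser
        self._url_validator = url_validator
        self._logger = logger
        self._parser_library = parser_library

p = Parser("urls", "responses", None, 0, 0, None, True,
           None, None, None, "lxml")
print(p._parser_library)  # lxml
```

A mismatched count in the original one-liner raises a ValueError only at runtime; the explicit form makes such a slip impossible.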
