AlexMathew/scrapple

Showing 35 of 673 total issues

Function get_fields has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.
Open

def get_fields(config):
    """
    Recursive generator that yields the field names in the config file

    :param config: The configuration file that contains the specification of the extractor
Severity: Minor
Found in scrapple/utils/config.py - About 1 hr to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

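To make the nesting rule concrete, here is a minimal sketch (not taken from scrapple) contrasting a deeply nested loop with a flattened equivalent built on guard clauses. Both functions do the same work, but the nested version pays an extra increment for every level at which a flow-breaking structure appears.

def count_long_names_nested(records):
    # Nested version: each `if` inside the `for` adds a nesting penalty,
    # so the cognitive complexity climbs quickly.
    total = 0
    for record in records:
        if record is not None:
            if "name" in record:
                if len(record["name"]) > 10:
                    total += 1
    return total

def count_long_names_flat(records):
    # Flattened version: guard clauses with `continue` keep every check
    # at the same nesting level, which reads (and scores) better.
    total = 0
    for record in records:
        if record is None or "name" not in record:
            continue
        if len(record["name"]) > 10:
            total += 1
    return total

if __name__ == "__main__":
    sample = [{"name": "abcdefghijkl"}, {"name": "ab"}, None, {}]
    assert count_long_names_nested(sample) == count_long_names_flat(sample) == 1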

Function extract_tabular has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.
Open

    def extract_tabular(self, header='', prefix='', suffix='', table_type='', *args, **kwargs):
        """
        Method for performing the tabular data extraction. \

        :param result: A dictionary containing the extracted data so far
Severity: Minor
Found in scrapple/selectors/selector.py - About 1 hr to fix

Function extract_content has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
Open

    def extract_content(self, selector='', attr='', default='', connector='', *args, **kwargs):
        """
        Method for performing the content extraction for the particular selector type. \

        If the selector is "url", the URL of the current web page is returned.
Severity: Minor
Found in scrapple/selectors/selector.py - About 55 mins to fix

Avoid deeply nested control flow statements.
Open

                            if not th in tabular_data_headers:
                                tabular_data_headers[th] = len(tabular_data_headers)
            if not self.config['scraping'].get('next'):
Severity: Major
Found in scrapple/commands/run.py - About 45 mins to fix
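
One possible way to remove the innermost if in the snippet above is dict.setdefault, which only inserts a key when it is missing. This is an illustrative sketch, not code from scrapple/commands/run.py, and the header values are made up.

tabular_data_headers = {}
for th in ["name", "price", "name", "rating"]:  # hypothetical header stream
    # setdefault assigns the next index only for unseen headers,
    # replacing the nested membership check and assignment.
    tabular_data_headers.setdefault(th, len(tabular_data_headers))

assert tabular_data_headers == {"name": 0, "price": 1, "rating": 2}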

Function extract_content has 6 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def extract_content(self, selector='', attr='', default='', connector='', *args, **kwargs):
Severity: Minor
Found in scrapple/selectors/selector.py - About 45 mins to fix

Avoid deeply nested control flow statements.
Open

                            if not th in tabular_data_headers:
                                tabular_data_headers[th] = len(tabular_data_headers)
        except KeyboardInterrupt:
Severity: Major
Found in scrapple/commands/run.py - About 45 mins to fix

Function extract_tabular has 6 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def extract_tabular(self, header='', prefix='', suffix='', table_type='', *args, **kwargs):
Severity: Minor
Found in scrapple/selectors/selector.py - About 45 mins to fix

Function traverse_next has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

def traverse_next(page, nextx, results, tabular_data_headers=[], verbosity=0):
Severity: Minor
Found in scrapple/utils/config.py - About 35 mins to fix

Function form_to_json has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

def form_to_json(form):
    """
    Takes the form from the POST request in the web interface, and generates the JSON config\
    file 

Severity: Minor
Found in scrapple/utils/form.py - About 25 mins to fix

Method "extract_columns" has 10 parameters, which is greater than the 7 authorized.
Open

    def extract_columns(self, result={}, selector='', table_headers=[], attr='', connector='', default='', verbosity=0, *args, **kwargs):
Severity: Major
Found in scrapple/selectors/selector.py by sonar-python

A long parameter list can indicate that a new structure should be created to wrap the numerous parameters or that the function is doing too many things.

Noncompliant Code Example

With a maximum number of 4 parameters:

def do_something(param1, param2, param3, param4, param5):
    ...

Compliant Solution

def do_something(param1, param2, param3, param4):
    ...
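
Applied to extract_columns, the suggestion above that a new structure be created to wrap the numerous parameters could look like the sketch below. ColumnSpec is a hypothetical name invented for illustration and is not part of scrapple's API.

from dataclasses import dataclass, field

@dataclass
class ColumnSpec:
    # Bundles the options that extract_columns() currently takes one by one.
    selector: str = ''
    table_headers: list = field(default_factory=list)
    attr: str = ''
    connector: str = ''
    default: str = ''
    verbosity: int = 0

def extract_columns(result, spec):
    # The body would read spec.selector, spec.attr, and so on; only the
    # signature is shown here to illustrate the shape of the refactor.
    ...

extract_columns({}, ColumnSpec(selector="//table//td", verbosity=1))

As a side benefit, default_factory keeps mutable defaults such as the empty list out of the signature, avoiding the shared-default pitfall visible in the original result={} and table_headers=[] arguments.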

Refactor this function to reduce its Cognitive Complexity from 25 to the 15 allowed.
Open

    def extract_columns(self, result={}, selector='', table_headers=[], attr='', connector='', default='', verbosity=0, *args, **kwargs):
Severity: Critical
Found in scrapple/selectors/selector.py by sonar-python

Cognitive Complexity is a measure of how hard the control flow of a function is to understand. Functions with high Cognitive Complexity will be difficult to maintain.

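For functions that overshoot the threshold by a wide margin, such as extract_columns above or run() in the next issue, the usual remedy is to extract each nested stage into its own small function, since the score is computed per function. The decomposition below is a hedged sketch with invented helper names and stub bodies; it is not scrapple's actual structure.

class RunCommand:
    # Hypothetical decomposition of a large run() method. The helper names
    # and stub return values are invented for illustration only.
    def run(self):
        config = self._load_config()
        rows, headers = self._scrape_pages(config)
        self._write_output(rows, headers)

    def _load_config(self):
        # Stands in for reading and validating the JSON configuration.
        return {"scraping": {"url": "http://example.com"}}

    def _scrape_pages(self, config):
        # Stands in for the crawling and extraction loop.
        return [], {}

    def _write_output(self, rows, headers):
        # Stands in for writing the CSV/JSON output file.
        print(len(rows), "rows,", len(headers), "headers")

RunCommand().run()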

Refactor this function to reduce its Cognitive Complexity from 44 to the 15 allowed.
Open

    def run(self):
Severity: Critical
Found in scrapple/commands/run.py by sonar-python

Refactor this function to reduce its Cognitive Complexity from 16 to the 15 allowed.
Open

    def extract_rows(self, result={}, selector='', table_headers=[], attr='', connector='', default='', verbosity=0, *args, **kwargs):
Severity: Critical
Found in scrapple/selectors/selector.py by sonar-python

Refactor this function to reduce its Cognitive Complexity from 31 to the 15 allowed.
Open

def traverse_next(page, nextx, results, tabular_data_headers=[], verbosity=0):
Severity: Critical
Found in scrapple/utils/config.py by sonar-python

Method "extract_rows" has 10 parameters, which is greater than the 7 authorized.
Open

    def extract_rows(self, result={}, selector='', table_headers=[], attr='', connector='', default='', verbosity=0, *args, **kwargs):
Severity: Major
Found in scrapple/selectors/selector.py by sonar-python
