python-security/pyt


Showing 158 of 158 total issues

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def __init__(self, label, ast_node, *, line_number=None, path):
Severity: Minor
Found in pyt/core/node_types.py - About 35 mins to fix
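One common way to bring an argument count back under the limit is to group related parameters into a small object. A minimal sketch of that refactoring (the `NodeInfo` dataclass and the `Node` wrapper here are hypothetical illustrations, not pyt's actual API):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class NodeInfo:
    """Hypothetical grouping of the location-related parameters."""
    path: str
    line_number: Optional[int] = None


class Node:
    # Before: __init__(self, label, ast_node, *, line_number=None, path)
    # After: three parameters instead of five.
    def __init__(self, label, ast_node, info):
        self.label = label
        self.ast_node = ast_node
        self.line_number = info.line_number
        self.path = info.path


node = Node('x = 1', object(), NodeInfo(path='example.py', line_number=1))
print(node.path)  # example.py
```

Grouping the location fields also gives them a single place to grow (e.g. adding a column number later changes one dataclass, not every call site).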

Function identify_triggers has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def identify_triggers(
Severity: Minor
Found in pyt/vulnerabilities/vulnerabilities.py - About 35 mins to fix

Function from_directory_import has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def from_directory_import(
Severity: Minor
Found in pyt/cfg/stmt_visitor.py - About 35 mins to fix

Identical blocks of code found in 2 locations. Consider refactoring.

    'Definitions: "' + '", "'
    .join([str(definition) for definition in self.definitions]) +
Severity: Minor
Found in pyt/core/module_definitions.py and 1 other location - About 35 mins to fix
pyt/core/module_definitions.py on lines 120..121

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency both to continue replicating and to diverge (leaving bugs as the two similar implementations differ in subtle ways).
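The identical `zip` loops flagged later in this report illustrate the standard fix: extract the shared block into one helper and call it from both sites. A minimal, hypothetical sketch (the `visit_pairs` name is illustrative, not the actual pyt code):

```python
import ast


def visit_pairs(targets, values, visit):
    """Shared loop extracted from two duplicated call sites."""
    for target, value in zip(targets, values):
        # Starred nodes need special handling, so stop pairwise visiting.
        if isinstance(target, ast.Starred) or isinstance(value, ast.Starred):
            break
        visit(target, value)


visited = []
visit_pairs(
    [ast.Name(id='a'), ast.Name(id='b')],
    [ast.Constant(value=1), ast.Constant(value=2)],
    lambda t, v: visited.append((t.id, v.value)),
)
print(visited)  # [('a', 1), ('b', 2)]
```

Both original call sites then shrink to a one-line call, and any future fix to the loop lands in exactly one place.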

Tuning

This issue has a mass of 33.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
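For example, a `.codeclimate.yml` fragment raising the Python mass threshold might look like the following (the value 40 is illustrative; check codeclimate-duplication's documentation for the current schema):

```yaml
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 40  # raise to report only larger duplicated blocks
```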

Refactorings

Further Reading

Function _get_inner_most_function_call has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def _get_inner_most_function_call(call_node):
        # Loop to inner most function call
        # e.g. return scrypt.inner in `foo = scrypt.outer(scrypt.inner(image_name))`
        old_call_node = None
        while call_node != old_call_node:
Severity: Minor
Found in pyt/cfg/stmt_visitor_helper.py - About 35 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
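As an illustration of the nesting rule, the two functions below do the same thing, but guard clauses keep the second one's flow linear; this is a generic sketch, unrelated to pyt's actual code:

```python
def find_first_match_nested(items, predicate):
    # Nested flow-breaking structures: each level of nesting adds
    # to the cognitive complexity score.
    result = None
    for item in items:
        if result is None:
            if predicate(item):
                result = item
    return result


def find_first_match_flat(items, predicate):
    # An early return flattens the nesting without changing behaviour,
    # so the flow of the code stays linear.
    for item in items:
        if predicate(item):
            return item
    return None


print(find_first_match_flat([1, 2, 3, 4], lambda x: x % 2 == 0))  # 2
```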

Further reading

Function get_func_cfg_with_tainted_args has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def get_func_cfg_with_tainted_args(self, definition):
        """Build a function cfg and return it, with all arguments tainted."""
        log.debug("Getting CFG for %s", definition.name)
        func_cfg = make_cfg(
            definition.node,
Severity: Minor
Found in pyt/web_frameworks/framework_adaptor.py - About 35 mins to fix

Identical blocks of code found in 2 locations. Consider refactoring.

    for target, value in zip(reversed(list(remaining_targets)), reversed(list(remaining_values))):
        if isinstance(target, ast.Starred) or isinstance(value, ast.Starred):
            break
        visit(target, value)
Severity: Minor
Found in pyt/cfg/stmt_visitor.py and 1 other location - About 35 mins to fix
pyt/cfg/stmt_visitor.py on lines 368..371

Identical blocks of code found in 2 locations. Consider refactoring.

    for target, value in zip(target_nodes, value_nodes):
        if isinstance(target, ast.Starred) or isinstance(value, ast.Starred):
            break
        visit(target, value)
Severity: Minor
Found in pyt/cfg/stmt_visitor.py and 1 other location - About 35 mins to fix
pyt/cfg/stmt_visitor.py on lines 374..377

Function report has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def report(
        vulnerabilities,
        fileobj,
        print_sanitised,
    ):
Severity: Minor
Found in pyt/formatters/screen.py - About 35 mins to fix

Function __str__ has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def __str__(self):
        module = 'NoModuleName'
        if self.module_name:
            module = self.module_name
Severity: Minor
Found in pyt/core/module_definitions.py - About 35 mins to fix

Function update_assignments has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def update_assignments(
        assignment_list,
        assignment_nodes,
        source,
        lattice
Severity: Minor
Found in pyt/vulnerabilities/vulnerabilities.py - About 35 mins to fix

Function get_directory_modules has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def get_directory_modules(directory):
        """Return a list containing tuples of
        e.g. ('__init__', 'example/import_test_project/__init__.py')
        """
        if _local_modules and os.path.dirname(_local_modules[0][1]) == directory:
Severity: Minor
Found in pyt/core/project_handler.py - About 35 mins to fix

Identical blocks of code found in 2 locations. Consider refactoring.

    def_arg_temp_name = 'temp_' + str(saved_function_call_index) + '_' + def_args[i]
Severity: Minor
Found in pyt/cfg/expr_visitor.py and 1 other location - About 35 mins to fix
pyt/cfg/expr_visitor.py on lines 352..352

Identical blocks of code found in 2 locations. Consider refactoring.

    'Definitions: "' + '", "'
    .join([str(definition) for definition in self.definitions]) +
Severity: Minor
Found in pyt/core/module_definitions.py and 1 other location - About 35 mins to fix
pyt/core/module_definitions.py on lines 126..127

Function connect_nodes has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def connect_nodes(nodes):
        """Connect the nodes in a list linearly."""
        for n, next_node in zip(nodes, nodes[1:]):
            if isinstance(n, ControlFlowNode):
                _connect_control_flow_node(n, next_node)
Severity: Minor
Found in pyt/cfg/stmt_visitor_helper.py - About 35 mins to fix

Identical blocks of code found in 2 locations. Consider refactoring.

    def_arg_temp_name = 'temp_' + str(saved_function_call_index) + '_' + def_args[i]
Severity: Minor
Found in pyt/cfg/expr_visitor.py and 1 other location - About 35 mins to fix
pyt/cfg/expr_visitor.py on lines 274..274

Avoid too many return statements within this function.

    return IgnoredNode()
Severity: Major
Found in pyt/cfg/stmt_visitor.py - About 30 mins to fix
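One way to cut down the return count in a long visitor method is to dispatch on node type through a mapping, so each case becomes a small handler and the method itself has a single return. A hedged sketch with hypothetical handler names (not pyt's actual visitor):

```python
import ast


def _visit_assign(node):
    return 'assignment'


def _visit_import(node):
    return 'import'


# Each entry replaces one `return ...` branch in the original method.
_HANDLERS = {
    ast.Assign: _visit_assign,
    ast.Import: _visit_import,
}


def visit(node):
    # Single exit point: look up the handler instead of chaining returns.
    handler = _HANDLERS.get(type(node), lambda n: 'ignored')
    return handler(node)


print(visit(ast.parse('x = 1').body[0]))  # assignment
```

Each handler stays small and individually testable, and adding a new node type means adding a dictionary entry rather than another return branch.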

Avoid too many return statements within this function.

    return self.add_module(
Severity: Major
Found in pyt/cfg/stmt_visitor.py - About 30 mins to fix

Avoid too many return statements within this function.

    return self.append_node(AssignmentNode(
Severity: Major
Found in pyt/cfg/stmt_visitor.py - About 30 mins to fix

Avoid too many return statements within this function.

    return self.assign_multi_target(node, rhs_visitor.result)
Severity: Major
Found in pyt/cfg/stmt_visitor.py - About 30 mins to fix