cea-sec/miasm

miasm/analysis/data_flow.py

Summary

Maintainability: F (about 3 wks of estimated remediation effort)
Test Coverage: (not reported)

File data_flow.py has 1918 lines of code (exceeds 250 allowed). Consider refactoring.

"""Data flow analysis based on miasm intermediate representation"""
from builtins import range
from collections import namedtuple, Counter
from pprint import pprint as pp
from future.utils import viewitems, viewvalues
Severity: Major
Found in miasm/analysis/data_flow.py - About 5 days to fix

    Function eval_assignblock has a Cognitive Complexity of 95 (exceeds 5 allowed). Consider refactoring.

        def eval_assignblock(self, assignblock):
            """
            Evaluate the @assignblock on the current state
            @assignblock: AssignBlock instance
            """
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 day to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to understand intuitively. Unlike Cyclomatic Complexity, which estimates how difficult your code will be to test, Cognitive Complexity estimates how difficult it will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"
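    For illustration, here is a small hypothetical Python sketch (not taken from data_flow.py). Both functions do the same work, but the first nests flow-breaking structures, so it scores much higher; the increments in the comments are approximations in the spirit of the rules above.

        def count_positive_nested(rows):
            total = 0
            for row in rows:                  # +1: break in linear flow
                if row is not None:           # +2: flow break, nested once
                    for item in row:          # +3: nested twice
                        if item > 0:          # +4: nested three levels deep
                            total += 1
            return total

        def count_positive_flat(rows):
            total = 0
            for row in rows:                  # +1: break in linear flow
                if row is None:               # +2: flow break, nested once
                    continue                  # guard clause caps the nesting
                # Shorthand collapsing several statements into one; by the
                # first rule, such shorthand does not by itself add complexity.
                total += sum(1 for item in row if item > 0)
            return total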

    Function get_equivalence_class has a Cognitive Complexity of 57 (exceeds 5 allowed). Consider refactoring.

        def get_equivalence_class(self, node, ids_to_src):
            todo = set([node])
            done = set()
            defined = set()
            equivalence = set()
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 day to fix

    Function merge has a Cognitive Complexity of 53 (exceeds 5 allowed). Consider refactoring.

        def merge(self, other):
            """
            Merge the current state with @other
            Merge rules:
            - if two nodes are equal in both states => in equivalence class
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 day to fix

    Function load_from_int has a Cognitive Complexity of 41 (exceeds 5 allowed). Consider refactoring.

    def load_from_int(ircfg, bs, is_addr_ro_variable):
        """
        Replace memory read based on constant with static value
        @ircfg: IRCFG instance
        @bs: binstream instance
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 6 hrs to fix

    Function del_dummy_phi has a Cognitive Complexity of 36 (exceeds 5 allowed). Consider refactoring.

        def del_dummy_phi(self, ssa, head):
            ids_to_src = {}
            def_to_loc = {}
            for block in viewvalues(ssa.graph.blocks):
                for index, assignblock in enumerate(block):
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 5 hrs to fix

    Function src_gen_phi_node_srcs has a Cognitive Complexity of 33 (exceeds 5 allowed). Consider refactoring.

        def src_gen_phi_node_srcs(self, equivalence_graph):
            for node in equivalence_graph.nodes():
                if not node.is_op("Phi"):
                    continue
                phi_successors = equivalence_graph.successors(node)
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 4 hrs to fix

    Function may_interfer has a Cognitive Complexity of 27 (exceeds 5 allowed). Consider refactoring.

        def may_interfer(self, dsts, src):
            """
            Return True if @src may interfere with expressions in @dsts
            @dsts: Set of Expressions
            @src: expression to test
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 3 hrs to fix

    Function update_phi_with_deleted_edges has a Cognitive Complexity of 23 (exceeds 5 allowed). Consider refactoring.

    def update_phi_with_deleted_edges(ircfg, edges_to_del):
        """
        Update phi which have a source present in @edges_to_del
        @ssa: IRCFG instance in ssa form
        @edges_to_del: edges to delete
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 3 hrs to fix

    Function find_definitions_from_worklist has a Cognitive Complexity of 22 (exceeds 5 allowed). Consider refactoring.

        def find_definitions_from_worklist(self, worklist, ircfg):
            """
            Find variables definition in @worklist by browsing the @ircfg
            """
            locs_done = set()
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 3 hrs to fix

    Function del_unused_edges has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.

    def del_unused_edges(ircfg, heads):
        """
        Delete non accessible edges in the @ircfg graph.
        @ircfg: IRCFG instance in ssa form
        @heads: location of the heads of the graph
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 2 hrs to fix

    Function _compute_def_use_block has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.

        def _compute_def_use_block(self, block, reaching_defs, deref_mem=False, apply_simp=False):
            for index, assignblk in enumerate(block):
                assignblk_reaching_defs = reaching_defs.get_definitions(block.loc_key, index)
                for lval, expr in viewitems(assignblk):
                    self.add_node(AssignblkNode(block.loc_key, index, lval))
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 2 hrs to fix

    Function get_useful_assignments has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.

        def get_useful_assignments(self, ircfg, defuse, reaching_defs):
            """
            Mark useful statements using previous reach analysis and defuse
    
            Return a set of triplets (block, assignblk number, lvalue) of
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 2 hrs to fix

    Function retrieve_stack_accesses has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring.

    def retrieve_stack_accesses(lifter, ircfg):
        """
        Walk the ssa graph and find stack based variables.
        Return a dictionary linking stack base address to its size/name
        @lifter: lifter_model_call instance
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 2 hrs to fix

    Function get_phi_sources has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring.

    def get_phi_sources(phi_src, phi_dsts, ids_to_src):
        """
        Return False if the @phi_src has more than one non-phi source
        Else, return its source
        @ids_to_src: Dictionary linking phi source to its definition
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 2 hrs to fix

    Function merge_blocks has a Cognitive Complexity of 15 (exceeds 5 allowed). Consider refactoring.

    def merge_blocks(ircfg, heads):
        """
        This function modifies @ircfg to apply the following transformations:
        - group an irblock with its son if the irblock has one and only one son and
          this son has one and only one parent (spaghetti code).
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function propagate has a Cognitive Complexity of 15 (exceeds 5 allowed). Consider refactoring.

        def propagate(self, ssa, head, max_expr_depth=None):
            """
            Apply algorithm on the @ssa graph
            """
            ircfg = ssa.ircfg
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function get_unreachable_nodes has a Cognitive Complexity of 14 (exceeds 5 allowed). Consider refactoring.

    def get_unreachable_nodes(ircfg, edges_to_del, heads):
        """
        Return the unreachable nodes starting from heads and the associated edges to
        be deleted.
    
    
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function merge_prev_states has a Cognitive Complexity of 14 (exceeds 5 allowed). Consider refactoring.

        def merge_prev_states(self, ircfg, states, loc_key):
            """
            Merge predecessors states of irblock at location @loc_key
            @ircfg: IRCfg instance
            @states: Dictionary linking locations to state
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function from_ssa has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.

        def from_ssa(cls, ssa):
            """
            Return a DefUse DiGraph from a SSA graph
            @ssa: SSADiGraph instance
            """
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function discard_phi_sources has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.

    def discard_phi_sources(ircfg, deleted_vars):
        """
        Remove phi sources in @ircfg belonging to @deleted_vars set
        @ircfg: IRCFG instance in ssa form
        @deleted_vars: unused phi sources
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function replace_stack_vars has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

    def replace_stack_vars(lifter, ircfg):
        """
        Try to replace stack based memory accesses by variables.
    
        Hypothesis: the input ircfg must have all it's accesses to stack explicitly
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function __init__ has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

        def __init__(self, ircfg):
            super(DiGraphLivenessSSA, self).__init__(ircfg)
    
            self.loc_key_to_phi_parents = {}
            for irblock in viewvalues(self.blocks):
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function _do_merge_blocks has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

    def _do_merge_blocks(ircfg, loc_key, son_loc_key):
        """
        Merge two irblocks at @loc_key and @son_loc_key
    
        @ircfg: DiGrpahIR
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function back_propagate_to_parent has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

        def back_propagate_to_parent(self, todo, node, parent):
            if parent not in self.blocks:
                return
            parent_block = self.blocks[parent]
            cur_block = self.blocks[node]
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function do_dead_removal has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.

        def do_dead_removal(self, ircfg):
            """
            Remove useless assignments.
    
            This function is used to analyse relation of a * complete function *
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function link_nodes has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.

        def link_nodes(self, expr, *args, **kwargs):
            """
            Transform an Expression @expr into a tree and add link nodes to the
            current tree
            @expr: Expression
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 1 hr to fix

    Function remove_empty_assignblks has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

    def remove_empty_assignblks(ircfg):
        """
        Remove empty assignblks in irblocks of @ircfg
        Return True if at least an irblock has been modified
    
    
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 55 mins to fix

    Function _test_jmp_only has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def _test_jmp_only(ircfg, loc_key, heads):
        """
        If irblock at @loc_key sets only IRDst to an ExprLoc, return the
        corresponding loc_key target.
        Avoid creating predecssors for heads LocKeys
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Function node2lines has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

        def node2lines(self, node):
            """
            Output liveness information in dot format
            """
            names = self.loc_db.get_location_names(node)
    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

        for mem in mems:
            value = read_mem(bs, mem)
            replace[mem] = value
        src_new = src.replace_expr(replace)

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

            if predecessor not in todo:
                todo.append(predecessor)
        continue

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

            if parent not in known:
                is_ok = False
                break
        if not is_ok:

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

            if src_new != src:
                modified = True
        # Test dst pointer if dst is mem
        if dst.is_mem():

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

        for predecessor in graph.predecessors(node):
            if predecessor not in todo:
                todo.append(predecessor)
        continue

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

        if mems:
            replace = {}
            for mem in mems:
                value = read_mem(bs, mem)
                replace[mem] = value

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

            if parent not in known:
                is_ok = False
                break
        if not is_ok:

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix

    Avoid deeply nested control flow statements.

        if old_dst in defined:
            continue
        fixed_phis[old_dst] = old_phi_src

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 45 mins to fix
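    The usual remedy for these warnings is to hoist the innermost work into a small helper and to prefer guard clauses over additional nesting. Below is a generic sketch with hypothetical names, patterned on the worklist excerpts above rather than taken from miasm:

        def queue_new_predecessors(graph, node, todo):
            """Queue predecessors of @node that have not been seen yet."""
            for predecessor in graph.predecessors(node):
                if predecessor not in todo:
                    todo.append(predecessor)

        def walk_backwards(graph, leaves):
            """Worklist traversal kept shallow by the helper above."""
            todo = list(leaves)
            done = set()
            while todo:
                node = todo.pop()
                if node in done:
                    continue  # guard clause instead of another nesting level
                done.add(node)
                queue_new_predecessors(graph, node, todo)
            return done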

    Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.

        def __init__(self, reaching_defs,

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 35 mins to fix

    Function process_block has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def process_block(self, block):
            """
            Fetch reach definitions from predecessors and propagate it to
            the assignblk in block @block.
            """

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 35 mins to fix

    Function __init__ has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def __init__(self, ircfg):
            super(DiGraphLiveness, self).__init__()
            self.ircfg = ircfg
            self.loc_db = ircfg.loc_db
            self._blocks = {}

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 35 mins to fix

    Function compute_liveness has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def compute_liveness(self):
            """
            Compute the liveness information for the digraph.
            """
            todo = set(self.leaves())

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 35 mins to fix

    Function get_master has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def get_master(self, node):
            """
            Return the representative element of the equivalence class containing
            @node
            @node: ExprMem or ExprId

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 35 mins to fix

    Function replace_node has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

        def replace_node(self, old_node, new_node):
            """
            Replace the @old_node by the @new_node
            """
            classes = self.get_classes()

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 35 mins to fix

    Avoid too many return statements within this function.

        return dst

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 30 mins to fix

    Avoid too many return statements within this function.

        return son

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 30 mins to fix

    Avoid too many return statements within this function.

        return None

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 30 mins to fix

    Avoid too many return statements within this function.

        return None

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 30 mins to fix

    Avoid too many return statements within this function.

        return true_value

    Severity: Major
    Found in miasm/analysis/data_flow.py - About 30 mins to fix
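    Early returns are often fine in short functions; this check fires when a long function exits from many scattered points. One common remedy, sketched here with hypothetical names, is to funnel the outcome through a single result variable or to split the decision into helpers:

        def pick_value(dst, son, true_value):
            # Each branch assigns instead of returning, so control flow
            # converges on a single exit point.
            if dst is not None:
                result = dst
            elif son is not None:
                result = son
            elif true_value is not None:
                result = true_value
            else:
                result = None
            return result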

    Function get_block_useful_destinations has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

        def get_block_useful_destinations(self, block):
            """
            Force keeping of specific cases
            block: IRBlock instance
            """

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 25 mins to fix

    Function add_equivalence has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

        def add_equivalence(self, node_a, node_b):
            """
            Add the new equivalence @node_a == @node_b
            @node_a is equivalent to @node_b, but @node_b is more representative
            than @node_a

    Severity: Minor
    Found in miasm/analysis/data_flow.py - About 25 mins to fix

    Similar blocks of code found in 2 locations. Consider refactoring.

        if dst_offset + dst_size <= int(dst_base.mask) + 1:
            # @32[ESP + 0xFFFFFFFC] => [0xFFFFFFFC, 0xFFFFFFFF]
            interval1 = interval([(dst_offset, dst_offset + dst.size // 8 - 1)])
        else:
            # @32[ESP + 0xFFFFFFFE] => [0x0, 0x1] U [0xFFFFFFFE, 0xFFFFFFFF]

    Severity: Major
    Found in miasm/analysis/data_flow.py and 1 other location - About 1 day to fix
    miasm/analysis/data_flow.py on lines 1940..1946

    Duplicated Code

    Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

    Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

    When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code tends both to keep replicating and to diverge, leaving bugs behind as two similar implementations drift apart in subtle ways.

    Tuning

    This issue has a mass of 129.

    We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

    The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

    If the engine reports duplication too easily, try raising the threshold. If you suspect it isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

    See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

    Similar blocks of code found in 2 locations. Consider refactoring.

        if src_offset + src_size <= int(src_base.mask) + 1:
            # @32[ESP + 0xFFFFFFFC] => [0xFFFFFFFC, 0xFFFFFFFF]
            interval2 = interval([(src_offset, src_offset + src.size // 8 - 1)])
        else:
            # @32[ESP + 0xFFFFFFFE] => [0x0, 0x1] U [0xFFFFFFFE, 0xFFFFFFFF]

    Severity: Major
    Found in miasm/analysis/data_flow.py and 1 other location - About 1 day to fix
    miasm/analysis/data_flow.py on lines 1933..1939
    This issue has a mass of 129.
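    Both similar blocks compute the interval of bytes touched by a memory access, including wrap-around at the top of the address space. A shared helper would give that logic a single home. The sketch below is hypothetical (the name and signature are invented) and assumes miasm.core.interval:

        from miasm.core.interval import interval

        def access_interval(offset, size_bytes, mask):
            # Interval of byte addresses touched by a @size_bytes access
            # at @offset in an address space bounded by @mask
            # (e.g. 0xFFFFFFFF for 32-bit).
            space = int(mask) + 1
            if offset + size_bytes <= space:
                # @32[ESP + 0xFFFFFFFC] => [0xFFFFFFFC, 0xFFFFFFFF]
                return interval([(offset, offset + size_bytes - 1)])
            # Wrap-around case:
            # @32[ESP + 0xFFFFFFFE] => [0x0, 0x1] U [0xFFFFFFFE, 0xFFFFFFFF]
            return interval([
                (offset, space - 1),
                (0, offset + size_bytes - space - 1),
            ])

    The two call sites would then reduce to calls like interval1 = access_interval(dst_offset, dst.size // 8, dst_base.mask).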

    Similar blocks of code found in 3 locations. Consider refactoring.

        if self.may_interfer(dsts, node):
            # Interfere with known equivalence class
            self.equivalence_classes.del_element(node)
            if node.is_id() or node.is_mem():
                self.undefined.add(node)

    Severity: Major
    Found in miasm/analysis/data_flow.py and 2 other locations - About 1 hr to fix
    miasm/analysis/data_flow.py on lines 2102..2105
    miasm/analysis/data_flow.py on lines 2109..2112
    This issue has a mass of 40.

    Similar blocks of code found in 3 locations. Consider refactoring.

        if dst in self.equivalence_classes.nodes():
            self.equivalence_classes.del_element(dst)
            if dst.is_id() or dst.is_mem():
                self.undefined.add(dst)

    Severity: Major
    Found in miasm/analysis/data_flow.py and 2 other locations - About 1 hr to fix
    miasm/analysis/data_flow.py on lines 2087..2091
    miasm/analysis/data_flow.py on lines 2102..2105
    This issue has a mass of 40.

                                This issue has a mass of 40.
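                                All three similar blocks drop a node from the equivalence classes and, when it is an identifier or a memory access, record it as undefined. A hypothetical helper along these lines could absorb them; this is a sketch against the attributes visible in the snippets, not the project's actual API:

                                    def _discard_node(self, node):
                                        """Drop @node from the equivalence classes; if it is an id or
                                        a memory access, mark it as undefined."""
                                        if node in self.equivalence_classes.nodes():
                                            self.equivalence_classes.del_element(node)
                                            if node.is_id() or node.is_mem():
                                                self.undefined.add(node)

                                Call sites that already know the node is present, such as the to_del loop in the next entry, could skip the membership test and simply call the helper in their loop.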

                                Similar blocks of code found in 3 locations. Consider refactoring.
                                Open

                                            for node in to_del:
                                                self.equivalence_classes.del_element(node)
                                                if node.is_id() or node.is_mem():
                                                    self.undefined.add(node)
                                Severity: Major
                                Found in miasm/analysis/data_flow.py and 2 other locations - About 1 hr to fix
                                miasm/analysis/data_flow.py on lines 2087..2091
                                miasm/analysis/data_flow.py on lines 2109..2112

                                This issue has a mass of 40.

                                Identical blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        for node in known_class:
                                            if self.order[node] < self.order[best_node]:
                                                best_node = node
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 55 mins to fix
                                miasm/analysis/data_flow.py on lines 1780..1782

                                This issue has a mass of 37.
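                                The duplicated loop is a plain argmin over self.order; since it already indexes self.order[best_node], every candidate must have a rank, and both copies could collapse into one expression (sketch):

                                    # Pick the lowest-ranked node, keeping the current best_node in
                                    # the running; ties resolve arbitrarily, as sets are unordered.
                                    best_node = min(
                                        set(known_class) | {best_node},
                                        key=lambda node: self.order[node],
                                    )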

                                Identical blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        for node in known_class:
                                            if self.order[node] < self.order[best_node]:
                                                best_node = node
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 55 mins to fix
                                miasm/analysis/data_flow.py on lines 1836..1838

                                This issue has a mass of 37.

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        elif node_a in self.node_to_class and node_b not in self.node_to_class:
                                            known_class = self.node_to_class[node_a]
                                            known_class.add(node_b)
                                            self.node_to_class[node_b] = known_class
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 50 mins to fix
                                miasm/analysis/data_flow.py on lines 1768..1771

                                This issue has a mass of 36.
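                                This branch and its mirror in the next entry differ only in which operand is already classified. Swapping the operands first lets a single branch cover both cases; a sketch, assuming node_to_class maps nodes to shared set objects as the add/assign pattern suggests:

                                    # Normalise so node_a is the classified operand, then merge once.
                                    if node_b in self.node_to_class and node_a not in self.node_to_class:
                                        node_a, node_b = node_b, node_a
                                    if node_a in self.node_to_class and node_b not in self.node_to_class:
                                        known_class = self.node_to_class[node_a]
                                        known_class.add(node_b)
                                        self.node_to_class[node_b] = known_class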

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        elif node_a not in self.node_to_class and node_b in self.node_to_class:
                                            known_class = self.node_to_class[node_b]
                                            known_class.add(node_a)
                                            self.node_to_class[node_a] = known_class
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 50 mins to fix
                                miasm/analysis/data_flow.py on lines 1764..1767

                                This issue has a mass of 36.

                                Identical blocks of code found in 2 locations. Consider refactoring.
                                Open

                                                    if node.is_id() or node.is_mem():
                                                        assert(node not in nodes_ok)
                                                        undefined.add(node)
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 45 mins to fix
                                miasm/analysis/data_flow.py on lines 2187..2190

                                This issue has a mass of 35.
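                                Both identical copies register a node as undefined after the same sanity check; a small hypothetical helper would keep them from drifting apart (sketch):

                                    def _set_undefined(node, nodes_ok, undefined):
                                        # Only ids and memory accesses can become undefined, and a
                                        # node already proven defined must never be reclassified.
                                        if node.is_id() or node.is_mem():
                                            assert node not in nodes_ok
                                            undefined.add(node)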

                                Similar blocks of code found in 4 locations. Consider refactoring.
                                Open

                                        out.append("\tKill:" + ", ".join(str(x) for x in self.kill))
                                Severity: Major
                                Found in miasm/analysis/data_flow.py and 3 other locations - About 45 mins to fix
                                miasm/analysis/data_flow.py on lines 1012..1012
                                miasm/analysis/data_flow.py on lines 1013..1013
                                miasm/analysis/data_flow.py on lines 1021..1021

                                This issue has a mass of 35.
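                                This entry and its three siblings below each format one attribute of the same report. A data-driven loop over (label, attribute) pairs would collapse all four lines into one statement (sketch, using the names visible in the snippets):

                                    for label, values in (("VarIn", self.var_in),
                                                          ("VarOut", self.var_out),
                                                          ("Gen", self.gen),
                                                          ("Kill", self.kill)):
                                        out.append("\t%s:%s" % (label, ", ".join(str(x) for x in values)))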

                                Similar blocks of code found in 4 locations. Consider refactoring.
                                Open

                                        out.append("\tVarOut:" + ", ".join(str(x) for x in self.var_out))
                                Severity: Major
                                Found in miasm/analysis/data_flow.py and 3 other locations - About 45 mins to fix
                                miasm/analysis/data_flow.py on lines 1012..1012
                                miasm/analysis/data_flow.py on lines 1013..1013
                                miasm/analysis/data_flow.py on lines 1014..1014

                                This issue has a mass of 35.

                                Similar blocks of code found in 4 locations. Consider refactoring.
                                Open

                                        out.append("\tGen:" + ", ".join(str(x) for x in self.gen))
                                Severity: Major
                                Found in miasm/analysis/data_flow.py and 3 other locations - About 45 mins to fix
                                miasm/analysis/data_flow.py on lines 1012..1012
                                miasm/analysis/data_flow.py on lines 1014..1014
                                miasm/analysis/data_flow.py on lines 1021..1021

                                This issue has a mass of 35.

                                Identical blocks of code found in 2 locations. Consider refactoring.
                                Open

                                            for node in component:
                                                if node.is_id() or node.is_mem():
                                                    assert(node not in nodes_ok)
                                                    undefined.add(node)
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 45 mins to fix
                                miasm/analysis/data_flow.py on lines 2159..2161

                                This issue has a mass of 35.

                                Similar blocks of code found in 4 locations. Consider refactoring.
                                Open

                                        out.append("\tVarIn:" + ", ".join(str(x) for x in self.var_in))
                                Severity: Major
                                Found in miasm/analysis/data_flow.py and 3 other locations - About 45 mins to fix
                                miasm/analysis/data_flow.py on lines 1013..1013
                                miasm/analysis/data_flow.py on lines 1014..1014
                                miasm/analysis/data_flow.py on lines 1021..1021

                                This issue has a mass of 35.

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        if son is not None and loc_key not in heads:
                                            ret = _remove_to_son(ircfg, loc_key, son)
                                            modified |= ret
                                            if ret:
                                                todo.add(loc_key)
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 35 mins to fix
                                miasm/analysis/data_flow.py on lines 629..637

                                This issue has a mass of 33.
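                                Here and in the mirrored entry below, the guards genuinely differ (_remove_to_son keys on loc_key, _remove_to_parent on son), so only the bookkeeping tail is shared. A hypothetical helper for that tail is sketched here, assuming both call sites update modified and todo the same way (the second snippet is cut short, so that is an assumption):

                                    def _apply_merge(merge, ircfg, loc_key, son, todo):
                                        # Run the merge primitive and queue the block for another
                                        # pass when it changed something.
                                        ret = merge(ircfg, loc_key, son)
                                        if ret:
                                            todo.add(loc_key)
                                        return ret

                                Each call site would keep its own guard and then do modified |= _apply_merge(_remove_to_son, ircfg, loc_key, son, todo).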

                                Similar blocks of code found in 2 locations. Consider refactoring.
                                Open

                                        if (son is not None and
                                            son not in heads and
                                            son in ircfg.blocks):
                                            # jmp only test done previously
                                            ret = _remove_to_parent(ircfg, loc_key, son)
                                Severity: Minor
                                Found in miasm/analysis/data_flow.py and 1 other location - About 35 mins to fix
                                miasm/analysis/data_flow.py on lines 621..626

                                This issue has a mass of 33.
