cea-sec/miasm
miasm/core/graph.py

Summary

Maintainability: F (estimated time to fix: 1 wk)
Test Coverage: not available

File graph.py has 864 lines of code (exceeds 250 allowed). Consider refactoring.

from collections import defaultdict, namedtuple

from future.utils import viewitems, viewvalues
import re

Severity: Major
Found in miasm/core/graph.py - About 2 days to fix

DiGraph has 61 functions (exceeds 20 allowed). Consider refactoring.

    class DiGraph(object):
    
        """Implementation of directed graph"""
    
        # Stand for a cell in a dot node rendering
    Severity: Major
    Found in miasm/core/graph.py - About 1 day to fix

Function compute_strongly_connected_components has a Cognitive Complexity of 36 (exceeds 5 allowed). Consider refactoring.

          def compute_strongly_connected_components(self):
              """
              Partitions the graph into strongly connected components.
      
              Iterative implementation of Gabow's path-based SCC algorithm.
      Severity: Minor
      Found in miasm/core/graph.py - About 5 hrs to fix
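For reference, a minimal recursive sketch of Gabow's path-based SCC algorithm (the flagged method is an iterative rewrite of this, which is largely what drives the complexity score; names and the plain-dict graph shape are mine, not miasm's API):

```python
def gabow_scc(graph):
    """Recursive sketch of Gabow's path-based SCC algorithm.
    graph: dict mapping node -> iterable of successor nodes."""
    preorder = {}            # node -> DFS preorder number
    component = {}           # node -> SCC id, once assigned
    stack_s, stack_p = [], []  # path stack / boundary stack of candidate roots
    counter = [0]
    sccs = []

    def dfs(v):
        preorder[v] = counter[0]
        counter[0] += 1
        stack_s.append(v)
        stack_p.append(v)
        for w in graph.get(v, ()):
            if w not in preorder:
                dfs(w)
            elif w not in component:
                # back/cross edge into the current path: collapse candidates
                while preorder[stack_p[-1]] > preorder[w]:
                    stack_p.pop()
        if stack_p and stack_p[-1] == v:
            stack_p.pop()
            scc_id = len(sccs)
            scc = set()
            while True:
                node = stack_s.pop()
                component[node] = scc_id
                scc.add(node)
                if node == v:
                    break
            sccs.append(scc)

    for node in graph:
        if node not in preorder:
            dfs(node)
    return sccs
```

The iterative version trades this recursion for an explicit work stack, which is where the nested `while` loops flagged below come from.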

      Cognitive Complexity

      Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

      A method's cognitive complexity is based on a few simple rules:

      • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
      • Code is considered more complex for each "break in the linear flow of the code"
      • Code is considered more complex when "flow breaking structures are nested"

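A hypothetical illustration of those rules: both functions below do the same work, but the nested one scores higher because each flow break inside another flow break pays an extra nesting increment (the counts in the comments are illustrative, not an official tool trace):

```python
def first_even_nested(rows):
    """Deeply nested: each inner structure pays a nesting penalty."""
    for row in rows:              # +1 (loop)
        for value in row:         # +1 (loop), +1 (nesting)
            if value % 2 == 0:    # +1 (condition), +2 (nesting)
                return value
    return None

def first_even_flat(rows):
    """Same behaviour, shallower flow: generator shorthand collapses
    the two loops and is not penalised as extra complexity."""
    flattened = (value for row in rows for value in row)
    for value in flattened:       # +1 (loop)
        if value % 2 == 0:        # +1 (condition), +1 (nesting)
            return value
    return None
```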

Function compute_dominance_frontier has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.

          def compute_dominance_frontier(self, head):
              """
              Compute the dominance frontier of the graph
      
              Source: Cooper, Keith D., Timothy J. Harvey, and Ken Kennedy.
      Severity: Minor
      Found in miasm/core/graph.py - About 2 hrs to fix
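The Cooper/Harvey/Kennedy frontier computation is short once immediate dominators are known; a hedged sketch over plain dicts (names are mine, not miasm's API, and `idom` is assumed to be a well-formed immediate-dominator map with `idom[head] == head`):

```python
def dominance_frontier(preds, idom):
    """preds: node -> list of predecessors; idom: node -> immediate dominator.
    Returns node -> set of nodes on its dominance frontier."""
    frontier = {node: set() for node in preds}
    for node, node_preds in preds.items():
        if len(node_preds) < 2:
            continue  # only join points contribute frontier entries
        for pred in node_preds:
            runner = pred
            # walk up the dominator tree until reaching node's idom
            while runner != idom[node]:
                frontier.setdefault(runner, set()).add(node)
                runner = idom[runner]
    return frontier
```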


Function _check_node has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.

          def _check_node(self, candidate, expected, graph, partial_sol=None):
              """Check if @candidate can stand for @expected in @graph, given @partial_sol
              @candidate: @graph's node
              @expected: MatchGraphJoker instance
              @graph: DiGraph instance
      Severity: Minor
      Found in miasm/core/graph.py - About 2 hrs to fix


Function _compute_generic_dominators has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring.

          def _compute_generic_dominators(head, reachable_cb, prev_cb, next_cb):
              """Generic algorithm to compute either the dominators or postdominators
              of the graph.
              @head: the head/leaf of the graph
              @reachable_cb: sons/parents of the head/leaf
      Severity: Minor
      Found in miasm/core/graph.py - About 2 hrs to fix
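The dataflow equation behind both directions is Dom(n) = {n} ∪ ⋂ Dom(p) over n's predecessors (swap predecessors for successors to get postdominators). A minimal fixed-point sketch, assuming a plain successor dict rather than miasm's callback interface:

```python
def compute_dominators(succs, head):
    """Fixed-point sketch of Dom(n) = {n} | intersection of Dom(p), p in preds(n).
    succs: node -> list of successors (hypothetical plain-dict shape)."""
    nodes = set(succs)
    for targets in succs.values():
        nodes.update(targets)
    preds = {n: set() for n in nodes}
    for n, targets in succs.items():
        for t in targets:
            preds[t].add(n)
    dom = {n: set(nodes) for n in nodes}  # start from "dominated by everything"
    dom[head] = {head}
    changed = True
    while changed:
        changed = False
        for n in nodes - {head}:
            incoming = [dom[p] for p in preds[n]]
            new = {n} | (set.intersection(*incoming) if incoming else set())
            if new != dom[n]:
                dom[n] = new
                changed = True
    return dom
```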


Function _walk_generic_dominator has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.

          def _walk_generic_dominator(node, gen_dominators, succ_cb):
              """Generic algorithm to return an iterator of the ordered list of
              @node's dominators/post_dominator.
      
              The function doesn't return the self reference in dominators.
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix


Function match has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.

          def match(self, graph):
              """Naive subgraph matching between graph and self.
              Iterator on matching solution, as dictionary MatchGraphJoker -> @graph
              @graph: DiGraph instance
        In order to obtain correct and complete results, @graph must be
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix


Function has_loop has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.

          def has_loop(self):
              """Return True if the graph contains at least a cycle"""
              todo = list(self.nodes())
              # tested nodes
              done = set()
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix
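A standard way to express this check is an iterative three-colour DFS: a cycle exists exactly when DFS meets a grey (on the current path) node again. A hedged sketch over a plain successor dict, not miasm's actual implementation:

```python
def has_cycle(succs):
    """Iterative three-colour DFS cycle check.
    succs: node -> iterable of successors."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {}
    for root in succs:
        if colour.get(root, WHITE) != WHITE:
            continue
        colour[root] = GREY
        stack = [(root, iter(succs.get(root, ())))]
        while stack:
            node, children = stack[-1]
            advanced = False
            for child in children:
                state = colour.get(child, WHITE)
                if state == GREY:
                    return True  # back edge onto the current path: cycle
                if state == WHITE:
                    colour[child] = GREY
                    stack.append((child, iter(succs.get(child, ()))))
                    advanced = True
                    break
            if not advanced:
                colour[node] = BLACK  # fully explored, safe
                stack.pop()
    return False
```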


Function _propagate_sol has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.

          def _propagate_sol(self, node, partial_sol, graph, todo, propagator):
              """
              Try to extend the current @partial_sol by propagating the solution using
              @propagator on @node.
              New solutions are added to @todo
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix


Function graphviz has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.

          def graphviz(self):
              try:
                  import re
                  import graphviz
      
      
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix


Function find_path has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.

          def find_path(self, src, dst, cycles_count=0, done=None):
              """
              Searches for paths from @src to @dst
              @src: loc_key of basic block from which it should start
              @dst: loc_key of basic block where it should stop
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix
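A forward sketch of bounded path enumeration, where `cycles_count` caps how often any node may repeat in a path so loops do not cause unbounded traversal (miasm's find_path walks predecessors from @dst; the names and dict shape here are mine):

```python
def find_paths(succs, src, dst, cycles_count=0):
    """DFS path enumeration from src to dst. Each node may appear at most
    cycles_count + 1 times per path.
    succs: node -> list of successors."""
    def walk(node, path, counts):
        if counts.get(node, 0) > cycles_count:
            return  # node already used up its allowed repetitions
        path = path + [node]
        if node == dst:
            yield path
            return
        counts = dict(counts)  # copy so sibling branches stay independent
        counts[node] = counts.get(node, 0) + 1
        for succ in succs.get(node, ()):
            yield from walk(succ, path, counts)
    return list(walk(src, [], {}))
```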


Function dot has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.

          def dot(self):
              """Render dot graph with HTML"""
      
              td_attr = {'align': 'left'}
              nodes_attr = {'shape': 'Mrecord',
      Severity: Minor
      Found in miasm/core/graph.py - About 1 hr to fix


Function find_path_from_src has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

          def find_path_from_src(self, src, dst, cycles_count=0, done=None):
              """
              This function does the same as function find_path.
              But it searches the paths from src to dst, not vice versa like find_path.
              This approach might be more efficient in some cases.
      Severity: Minor
      Found in miasm/core/graph.py - About 55 mins to fix


Avoid deeply nested control flow statements.

                              while index[current.node] < boundaries[-1]:
                                  boundaries.pop()
      
                      # merge strongly connected component
                      else:
      Severity: Major
      Found in miasm/core/graph.py - About 45 mins to fix

Avoid deeply nested control flow statements.

                                while index[current.node] <= len(stack):
                                    popped = stack.pop()
                                    index[popped] = counter
                                    scc.add(popped)
        
        
        Severity: Major
        Found in miasm/core/graph.py - About 45 mins to fix

Avoid deeply nested control flow statements.

                                  if runner not in frontier:
                                      frontier[runner] = set()
          
          
          Severity: Major
          Found in miasm/core/graph.py - About 45 mins to fix

Function _propagate_sol has 5 arguments (exceeds 4 allowed). Consider refactoring.

                def _propagate_sol(self, node, partial_sol, graph, todo, propagator):
            Severity: Minor
            Found in miasm/core/graph.py - About 35 mins to fix

Function compute_immediate_dominators has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

                  def compute_immediate_dominators(self, head):
                      """Compute the immediate dominators of the graph"""
                      dominators = self.compute_dominators(head)
                      idoms = {}
              
              
              Severity: Minor
              Found in miasm/core/graph.py - About 35 mins to fix
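Given full dominator sets, the immediate dominator of a node is its strict dominator that all its other strict dominators also dominate, i.e. the deepest one, which is the strict dominator with the largest dominator set. A hedged sketch over plain dicts (not miasm's API):

```python
def immediate_dominators(dom):
    """dom: node -> set of its dominators (including itself).
    Returns node -> immediate dominator; the head, which has no strict
    dominator, is omitted."""
    idoms = {}
    for node, doms in dom.items():
        strict = doms - {node}
        if strict:
            # deepest strict dominator = longest dominator chain above it
            idoms[node] = max(strict, key=lambda d: len(dom[d]))
    return idoms
```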


Avoid too many return statements within this function.

                              return False
              Severity: Major
              Found in miasm/core/graph.py - About 30 mins to fix

Avoid too many return statements within this function.

                                return False
                Severity: Major
                Found in miasm/core/graph.py - About 30 mins to fix

Avoid too many return statements within this function.

                          return True
                  Severity: Major
                  Found in miasm/core/graph.py - About 30 mins to fix

Avoid too many return statements within this function.

                                return True
                    Severity: Major
                    Found in miasm/core/graph.py - About 30 mins to fix

Function compute_immediate_postdominators has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

                          def compute_immediate_postdominators(self,tail):
                              """Compute the immediate postdominators of the graph"""
                              postdominators = self.compute_postdominators(tail)
                              ipdoms = {}
                      
                      
                      Severity: Minor
                      Found in miasm/core/graph.py - About 25 mins to fix


Function compute_back_edges has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

                          def compute_back_edges(self, head):
                              """
                              Computes all back edges from a node to a
                              dominator in the graph.
                              :param head: head of graph
                      Severity: Minor
                      Found in miasm/core/graph.py - About 25 mins to fix
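The definition is compact: an edge (u, v) is a back edge when its target v dominates its source u. A hedged one-liner sketch, assuming dominator sets have already been computed (plain-dict shapes, not miasm's API):

```python
def compute_back_edges(succs, dom):
    """succs: node -> iterable of successors; dom: node -> set of dominators.
    Returns the set of back edges (u, v) where v dominates u."""
    return {(u, v)
            for u, targets in succs.items()
            for v in targets
            if v in dom[u]}
```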


Function replace_node has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

                          def replace_node(self, node, new_node):
                              """
                              Replace @node by @new_node
                              """
                      
                      
                      Severity: Minor
                      Found in miasm/core/graph.py - About 25 mins to fix
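The essence of the operation is rewiring every edge touching @node onto @new_node and then dropping @node. A hedged sketch over a successor-set dict (miasm's DiGraph keeps separate edge bookkeeping; the shape here is mine):

```python
def replace_node(succs, node, new_node):
    """Rewire every edge touching node onto new_node, then drop node.
    succs: node -> set of successors, mutated in place."""
    outgoing = succs.pop(node, set())
    succs.setdefault(new_node, set()).update(outgoing)
    for sources in succs.values():
        if node in sources:
            # redirect incoming edges (a self-loop becomes new_node -> new_node)
            sources.discard(node)
            sources.add(new_node)
```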


Function compute_weakly_connected_components has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

                          def compute_weakly_connected_components(self):
                              """
                              Return the weakly connected components
                              """
                              remaining = set(self.nodes())
                      Severity: Minor
                      Found in miasm/core/graph.py - About 25 mins to fix
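Weakly connected components are the components of the underlying undirected graph, so the usual approach is to symmetrise the edges and flood-fill from each unvisited node. A hedged sketch (plain-dict shape, not miasm's API):

```python
def weakly_connected_components(succs):
    """succs: node -> iterable of successors.
    Returns a list of node sets, one per weakly connected component."""
    adj = {}  # undirected adjacency: ignore edge direction
    for u, targets in succs.items():
        adj.setdefault(u, set())
        for v in targets:
            adj.setdefault(v, set())
            adj[u].add(v)
            adj[v].add(u)
    remaining = set(adj)
    components = []
    while remaining:
        todo = [remaining.pop()]
        comp = set(todo)
        while todo:  # flood fill the component
            node = todo.pop()
            for neigh in adj[node] - comp:
                comp.add(neigh)
                todo.append(neigh)
        components.append(comp)
        remaining -= comp
    return components
```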


Similar blocks of code found in 2 locations. Consider refactoring.

                              if ((expected.restrict_in == True and
                                   len(self.predecessors(expected)) != len(graph.predecessors(candidate))) or
                                  (expected.restrict_in == False and
                                   len(self.predecessors(expected)) > len(graph.predecessors(candidate)))):
                                  return False
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 5 hrs to fix
                      miasm/core/graph.py on lines 1025..1029

                      Duplicated Code

                      Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

                      Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

                      When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

                      Tuning

                      This issue has a mass of 90.

                      We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

                      The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

                      If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

                      See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.


Similar blocks of code found in 2 locations. Consider refactoring.

                              if ((expected.restrict_out == True and
                                   len(self.successors(expected)) != len(graph.successors(candidate))) or
                                  (expected.restrict_out == False and
                                   len(self.successors(expected)) > len(graph.successors(candidate)))):
                                  return False
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 5 hrs to fix
                      miasm/core/graph.py on lines 1020..1024


Similar blocks of code found in 2 locations. Consider refactoring.

                          def compute_immediate_dominators(self, head):
                              """Compute the immediate dominators of the graph"""
                              dominators = self.compute_dominators(head)
                              idoms = {}
                      
                      
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 5 hrs to fix
                      miasm/core/graph.py on lines 564..574


Similar blocks of code found in 2 locations. Consider refactoring.

                          def compute_immediate_postdominators(self,tail):
                              """Compute the immediate postdominators of the graph"""
                              postdominators = self.compute_postdominators(tail)
                              ipdoms = {}
                      
                      
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 5 hrs to fix
                      miasm/core/graph.py on lines 552..562

                      Duplicated Code

                      Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

                      Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

                      When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

                      Tuning

                      This issue has a mass of 87.

                      We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

                      The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

                      If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

                      See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
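The threshold described above is set per language in `.codeclimate.yml`. A hypothetical sketch, following the key names in codeclimate-duplication's documentation (the value 60 is an arbitrary example, not a recommendation for this repository):

```yaml
# .codeclimate.yml (sketch): raise the Python mass threshold so only
# larger blocks are reported as duplicates.
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 60
```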

                      Refactorings

                      Further Reading

                      Identical blocks of code found in 2 locations. Consider refactoring.
                      Open

                                      for lineDesc in elements:
                                          out_render = ""
                                          if isinstance(lineDesc, self.DotCellDescription):
                                              lineDesc = [lineDesc]
                                          for col in lineDesc:
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 4 hrs to fix
                      miasm/core/graph.py on lines 267..275

                      This issue has a mass of 76.

                      Identical blocks of code found in 2 locations. Consider refactoring.
                      Open

                                  for lineDesc in self.node2lines(node):
                                      out_render = ""
                                      if isinstance(lineDesc, self.DotCellDescription):
                                          lineDesc = [lineDesc]
                                      for col in lineDesc:
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 4 hrs to fix
                      miasm/core/graph.py on lines 320..328
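Both rendering loops normalize a single cell descriptor to a one-element row and then render each column, so the shared body could live in one helper. A minimal sketch, assuming a cell descriptor with `text` and `attr` fields (the real descriptor is `DiGraph.DotCellDescription`; the helper and `render_col` callback here are hypothetical):

```python
from collections import namedtuple

# Stand-in for the cell descriptor used by the dot renderer.
CellDesc = namedtuple("CellDesc", ["text", "attr"])

def render_line(line_desc, render_col):
    # Normalize a lone descriptor to a one-element row, then render
    # every column -- the body duplicated at both report locations.
    if isinstance(line_desc, CellDesc):
        line_desc = [line_desc]
    return "".join(render_col(col) for col in line_desc)
```

Each call site would then only supply its own column-rendering callback instead of repeating the loop.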

                      This issue has a mass of 76.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              for succ in self.successors(expected):
                                  if (succ in partial_sol and
                                          partial_sol[succ] not in graph.successors(candidate)):
                                      return False
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 1 hr to fix
                      miasm/core/graph.py on lines 1034..1037

                      This issue has a mass of 48.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              for pred in self.predecessors(expected):
                                  if (pred in partial_sol and
                                          partial_sol[pred] not in graph.predecessors(candidate)):
                                      return False
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 1 hr to fix
                      miasm/core/graph.py on lines 1039..1042
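The two loops above are the same consistency check run once over successors and once over predecessors, so they could share one direction-agnostic helper. A hedged sketch (the function name is hypothetical): every already-matched neighbor of the expected node must map to a neighbor of the candidate node, or the partial match cannot be extended.

```python
def neighbors_consistent(expected_neighbors, candidate_neighbors, partial_sol):
    # Shared form of the symmetric check: reject the candidate if any
    # already-matched neighbor maps outside the candidate's neighbors.
    for neighbor in expected_neighbors:
        if (neighbor in partial_sol and
                partial_sol[neighbor] not in candidate_neighbors):
            return False
    return True
```

One call site would pass the successor sets, the other the predecessor sets.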

                      This issue has a mass of 48.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                          def successors_iter(self, node):
                              if node not in self._nodes_succ:
                                  return
                              for n_suc in self._nodes_succ[node]:
                                  yield n_suc
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 1 hr to fix
                      miasm/core/graph.py on lines 117..121
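`successors_iter` and `predecessors_iter` walk different mappings with identical logic, so both could delegate to one helper. A minimal stand-in class (not miasm's `DiGraph`; names are illustrative) showing the shape of the refactoring:

```python
class GraphSketch:
    """Sketch of a digraph whose two iterators share one helper."""

    def __init__(self):
        self._nodes_succ = {}  # node -> list of successors
        self._nodes_pred = {}  # node -> list of predecessors

    def add_edge(self, src, dst):
        self._nodes_succ.setdefault(src, []).append(dst)
        self._nodes_pred.setdefault(dst, []).append(src)

    @staticmethod
    def _neighbors_iter(mapping, node):
        # Yield every neighbor recorded for node; nothing if unknown.
        for neighbor in mapping.get(node, ()):
            yield neighbor

    def successors_iter(self, node):
        return self._neighbors_iter(self._nodes_succ, node)

    def predecessors_iter(self, node):
        return self._neighbors_iter(self._nodes_pred, node)
```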

                      This issue has a mass of 43.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                          def predecessors_iter(self, node):
                              if node not in self._nodes_pred:
                                  return
                              for n_pred in self._nodes_pred[node]:
                                  yield n_pred
                      Severity: Major
                      Found in miasm/core/graph.py and 1 other location - About 1 hr to fix
                      miasm/core/graph.py on lines 126..130

                      This issue has a mass of 43.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                          def leaves_iter(self):
                              for node in self._nodes:
                                  if not self._nodes_succ[node]:
                                      yield node
                      Severity: Minor
                      Found in miasm/core/graph.py and 1 other location - About 45 mins to fix
                      miasm/core/graph.py on lines 143..146

                      This issue has a mass of 35.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                          def heads_iter(self):
                              for node in self._nodes:
                                  if not self._nodes_pred[node]:
                                      yield node
                      Severity: Minor
                      Found in miasm/core/graph.py and 1 other location - About 45 mins to fix
                      miasm/core/graph.py on lines 135..138
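`heads_iter` and `leaves_iter` differ only in which link map they test, so one boundary helper would cover both. A hedged, free-standing sketch (name and signature are hypothetical): yield each node whose entry in the given map is empty, i.e. no predecessors for heads, no successors for leaves.

```python
def boundary_iter(nodes, link_map):
    # Yield nodes with no recorded links in link_map:
    # pass the predecessor map for heads, the successor map for leaves.
    for node in nodes:
        if not link_map.get(node):
            yield node
```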

                      This issue has a mass of 35.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              if src in done and done[src] > cycles_count:
                                  return [[]]
                      Severity: Minor
                      Found in miasm/core/graph.py and 1 other location - About 30 mins to fix
                      miasm/core/graph.py on lines 162..163

                      This issue has a mass of 32.

                      Similar blocks of code found in 2 locations. Consider refactoring.
                      Open

                              if dst in done and done[dst] > cycles_count:
                                  return [[]]
                      Severity: Minor
                      Found in miasm/core/graph.py and 1 other location - About 30 mins to fix
                      miasm/core/graph.py on lines 191..192

                      This issue has a mass of 32.
