cea-sec/miasm


Showing 3,020 of 3,020 total issues

Similar blocks of code found in 2 locations. Consider refactoring.
Open

                    if dst_offset + dst_size <= int(dst_base.mask) + 1:
                        # @32[ESP + 0xFFFFFFFC] => [0xFFFFFFFC, 0xFFFFFFFF]
                        interval1 = interval([(dst_offset, dst_offset + dst.size // 8 - 1)])
                    else:
                        # @32[ESP + 0xFFFFFFFE] => [0x0, 0x1] U [0xFFFFFFFE, 0xFFFFFFFF]
Severity: Major
Found in miasm/analysis/data_flow.py and 1 other location - About 1 day to fix
miasm/analysis/data_flow.py on lines 1940..1946

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 129.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.


Similar blocks of code found in 2 locations. Consider refactoring.
Open

                    if src_offset + src_size <= int(src_base.mask) + 1:
                        # @32[ESP + 0xFFFFFFFC] => [0xFFFFFFFC, 0xFFFFFFFF]
                        interval2 = interval([(src_offset, src_offset + src.size // 8 - 1)])
                    else:
                        # @32[ESP + 0xFFFFFFFE] => [0x0, 0x1] U [0xFFFFFFFE, 0xFFFFFFFF]
Severity: Major
Found in miasm/analysis/data_flow.py and 1 other location - About 1 day to fix
miasm/analysis/data_flow.py on lines 1933..1939
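
One way to resolve this pair would be to hoist the wrapped-interval computation into a shared helper that both the destination and source paths call. The sketch below is illustrative only: the helper name, its signature, and the wrap-around arithmetic in the else branch are assumptions inferred from the two excerpts, not miasm's actual code.

from miasm.core.interval import interval

def _mem_access_interval(offset, size_bits, mask):
    # Hypothetical helper: byte interval touched by a size_bits-wide memory
    # access at offset, wrapping around the address space bounded by mask.
    size_bytes = size_bits // 8
    if offset + size_bytes <= int(mask) + 1:
        # @32[ESP + 0xFFFFFFFC] => [0xFFFFFFFC, 0xFFFFFFFF]
        return interval([(offset, offset + size_bytes - 1)])
    # @32[ESP + 0xFFFFFFFE] => [0x0, 0x1] U [0xFFFFFFFE, 0xFFFFFFFF]
    wrapped_end = offset + size_bytes - (int(mask) + 1) - 1
    return interval([(offset, int(mask)), (0, wrapped_end)])

Both call sites would then reduce to something like interval1 = _mem_access_interval(dst_offset, dst.size, dst_base.mask), with the symmetric call for the source operand.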


File pe_init.py has 523 lines of code (exceeds 250 allowed). Consider refactoring.
Open

#! /usr/bin/env python

from __future__ import print_function

from builtins import range
Severity: Major
Found in miasm/loader/pe_init.py - About 1 day to fix

File depgraph.py has 518 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""Provide dependency graph"""

from functools import total_ordering

from future.utils import viewitems
Severity: Major
Found in miasm/analysis/depgraph.py - About 1 day to fix

File ssa.py has 518 lines of code (exceeds 250 allowed). Consider refactoring.
Open

from collections import deque
from future.utils import viewitems, viewvalues

from miasm.expression.expression import ExprId, ExprAssign, ExprOp, \
    ExprLoc, get_expr_ids
Severity: Major
Found in miasm/analysis/ssa.py - About 1 day to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

def pcmpeq(_, instr, dst, src, size):
    e = []
    for i in range(0, dst.size, size):
        test = m2_expr.expr_is_equal(dst[i:i + size], src[i:i + size])
        e.append(m2_expr.ExprAssign(dst[i:i + size],
Severity: Major
Found in miasm/arch/x86/sem.py and 1 other location - About 1 day to fix
miasm/arch/x86/sem.py on lines 4516..4524

Duplicated Code

This issue has a mass of 127.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

def pcmpgt(_, instr, dst, src, size):
    e = []
    for i in range(0, dst.size, size):
        test = m2_expr.expr_is_signed_greater(dst[i:i + size], src[i:i + size])
        e.append(m2_expr.ExprAssign(dst[i:i + size],
Severity: Major
Found in miasm/arch/x86/sem.py and 1 other location - About 1 day to fix
miasm/arch/x86/sem.py on lines 4505..4513
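
The two handlers above differ only in the per-lane comparison, so they could share one parameterized core. The following is a minimal sketch under that assumption; the helper name _pcmp, the ExprCond-based lane update and the (e, []) return convention are guesses based on the truncated excerpts, not the actual sem.py code.

import miasm.expression.expression as m2_expr

def _pcmp(dst, src, size, cmp_op):
    # Set each size-bit lane of dst to all-ones when cmp_op holds, else zero.
    e = []
    for i in range(0, dst.size, size):
        test = cmp_op(dst[i:i + size], src[i:i + size])
        e.append(m2_expr.ExprAssign(
            dst[i:i + size],
            m2_expr.ExprCond(test,
                             m2_expr.ExprInt((1 << size) - 1, size),
                             m2_expr.ExprInt(0, size))))
    return e, []

def pcmpeq(_, instr, dst, src, size):
    return _pcmp(dst, src, size, m2_expr.expr_is_equal)

def pcmpgt(_, instr, dst, src, size):
    return _pcmp(dst, src, size, m2_expr.expr_is_signed_greater)

pcmpeq and pcmpgt then stay as one-line wrappers with their original signatures.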


Similar blocks of code found in 2 locations. Consider refactoring.
Open

        for entry in self.impdesc:
            length += len(entry.dlldescname)
            if entry.originalfirstthunk and self.parent_head.rva2off(entry.originalfirstthunk):
                length += (len(entry.originalfirstthunks) + 1) * rva_size
            if entry.firstthunk:
Severity: Major
Found in miasm/loader/pe.py and 1 other location - About 1 day to fix
miasm/loader/pe.py on lines 935..943

Duplicated Code

This issue has a mass of 126.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        for entry in self.delaydesc:
            length += len(entry.dlldescname)
            if entry.originalfirstthunk and self.parent_head.rva2off(entry.originalfirstthunk):
                length += (len(entry.originalfirstthunks) + 1) * rva_size
            if entry.firstthunk:
Severity: Major
Found in miasm/loader/pe.py and 1 other location - About 1 day to fix
miasm/loader/pe.py on lines 405..413
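
Both loops compute the same per-descriptor length, once for the import table and once for the delay-import table. A module-level helper could absorb the shared body; the helper name below and the body of the firstthunk branch (which is cut off in the excerpts) are assumptions, so treat this as a sketch rather than the real pe.py code.

def _descriptors_length(parent_head, descriptors, rva_size):
    # Hypothetical shared helper for the import / delay-import length loops.
    length = 0
    for entry in descriptors:
        length += len(entry.dlldescname)
        if entry.originalfirstthunk and parent_head.rva2off(entry.originalfirstthunk):
            length += (len(entry.originalfirstthunks) + 1) * rva_size
        if entry.firstthunk:
            # Assumed symmetric with the originalfirstthunk branch above.
            length += (len(entry.firstthunks) + 1) * rva_size
    return length

The two call sites would then pass self.parent_head together with self.impdesc or self.delaydesc.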


Function merge has a Cognitive Complexity of 53 (exceeds 5 allowed). Consider refactoring.
Open

    def merge(self, other):
        """
        Merge the current state with @other
        Merge rules:
        - if two nodes are equal in both states => in equivalence class
Severity: Minor
Found in miasm/analysis/data_flow.py - About 1 day to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

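As a rough illustration of these rules (this example is not taken from miasm), nested branching scores higher than the same logic written with guard clauses:

# Nested flow: every additional nesting level increases the score.
def allowed_nested(user, resource):
    if user is not None:
        if user.is_active:
            if resource.owner == user:
                return True
    return False

# Flattened with guard clauses: same behaviour, fewer nested flow breaks.
def allowed_flat(user, resource):
    if user is None or not user.is_active:
        return False
    return resource.owner == user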

Similar blocks of code found in 2 locations. Consider refactoring.
Open

def _saturation_sub_signed(expr):
    assert expr.is_op("+") and len(expr.args) == 2 and expr.args[-1].is_op("-")

    # Compute the subtraction on two more bits, see _saturation_sub_unsigned
    arg1 = expr.args[0].signExtend(expr.size + 2)
Severity: Major
Found in miasm/arch/x86/sem.py and 1 other location - About 1 day to fix
miasm/arch/x86/sem.py on lines 4902..4909

Duplicated Code

This issue has a mass of 125.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

def _saturation_sub_unsigned(expr):
    assert expr.is_op("+") and len(expr.args) == 2 and expr.args[-1].is_op("-")

    # Compute the subtraction on one more bit to be able to distinguish cases:
    # 0x48 - 0xd7 in 8 bit, should saturate
Severity: Major
Found in miasm/arch/x86/sem.py and 1 other location - About 1 day to fix
miasm/arch/x86/sem.py on lines 4911..4917
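
The two saturation helpers appear to differ only in the extension used (sign vs. zero) and in how many extra bits the subtraction is widened by. Assuming that is the full extent of the difference, the shared prologue could be factored out as below; the helper name and the idea of leaving the final clamping to each caller are assumptions, not the actual sem.py refactoring.

def _widened_sub(expr, extra_bits, extend_name):
    # Hypothetical shared prologue: rebuild expr's subtraction on a wider
    # type so saturation can be detected.  extend_name is 'signExtend' or
    # 'zeroExtend'.  Uses miasm's Expr operator overloading for the result.
    assert expr.is_op("+") and len(expr.args) == 2 and expr.args[-1].is_op("-")
    new_size = expr.size + extra_bits
    arg1 = getattr(expr.args[0], extend_name)(new_size)
    arg2 = getattr(expr.args[1].args[0], extend_name)(new_size)
    return arg1 - arg2

Each simplifier would then keep only its own saturation test on the widened result.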


Similar blocks of code found in 2 locations. Consider refactoring.
Open

def cvtpd2ps(_, instr, dst, src):
    e = []
    e.append(
        m2_expr.ExprAssign(dst[:32], m2_expr.ExprOp('fpconvert_fp32', src[:64])))
    e.append(
Severity: Major
Found in miasm/arch/x86/sem.py and 1 other location - About 1 day to fix
miasm/arch/x86/sem.py on lines 4148..4155

Duplicated Code

This issue has a mass of 125.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

def cvtpd2dq(_, instr, dst, src):
    e = []
    e.append(
        m2_expr.ExprAssign(dst[:32], m2_expr.ExprOp('fp_to_sint32', src[:64])))
    e.append(
Severity: Major
Found in miasm/arch/x86/sem.py and 1 other location - About 1 day to fix
miasm/arch/x86/sem.py on lines 4167..4174
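
Both conversion handlers apply one floating-point operation to the 64-bit source lanes and write the low 32-bit destination lanes, so the operation name is the only difference visible in the excerpts. A hedged sketch of a shared core follows; the second lane assignment is inferred from the instruction semantics (assuming 128-bit operands) and is not shown in the excerpts.

import miasm.expression.expression as m2_expr

def _cvt_pd_low_lanes(dst, src, op_name):
    # Hypothetical shared core for cvtpd2ps / cvtpd2dq style conversions.
    return [
        m2_expr.ExprAssign(dst[:32], m2_expr.ExprOp(op_name, src[:64])),
        m2_expr.ExprAssign(dst[32:64], m2_expr.ExprOp(op_name, src[64:128])),
    ]

def cvtpd2ps(_, instr, dst, src):
    e = _cvt_pd_low_lanes(dst, src, 'fpconvert_fp32')
    # Any remaining statements of the original handler (e.g. clearing the
    # upper half of dst) would stay here.
    return e, []

def cvtpd2dq(_, instr, dst, src):
    e = _cvt_pd_low_lanes(dst, src, 'fp_to_sint32')
    return e, []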


File arch.py has 509 lines of code (exceeds 250 allowed). Consider refactoring.
Open

#-*- coding:utf-8 -*-

from builtins import range

import logging
Severity: Major
Found in miasm/arch/msp430/arch.py - About 1 day to fix

Function build_all has a Cognitive Complexity of 52 (exceeds 5 allowed). Consider refactoring.
Open

def build_all():
    packages=[
        "miasm",
        "miasm/arch",
        "miasm/arch/x86",
Severity: Minor
Found in setup.py - About 1 day to fix


Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def __repr__(self):
        rep = ["<%s>" % self.__class__.__name__]
        for i, entry in enumerate(self.delaydesc):
            out = "%2d %-25s %s" % (i, repr(entry.dlldescname), repr(entry))
            rep.append(out)
Severity: Major
Found in miasm/loader/pe.py and 1 other location - About 1 day to fix
miasm/loader/pe.py on lines 506..514

Duplicated Code

This issue has a mass of 123.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def __repr__(self):
        rep = ["<%s>" % self.__class__.__name__]
        for i, entry in enumerate(self.impdesc):
            out = "%2d %-25s %s" % (i, repr(entry.dlldescname), repr(entry))
            rep.append(out)
Severity: Major
Found in miasm/loader/pe.py and 1 other location - About 1 day to fix
miasm/loader/pe.py on lines 1007..1015
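
The two __repr__ methods are identical apart from the attribute they iterate (impdesc vs. delaydesc), so a shared formatter is one obvious factoring. The helper name and the newline join below are assumptions, since the excerpts stop before the return statement.

def _repr_dll_entries(obj, entries):
    # Hypothetical shared formatter for the import / delay-import __repr__.
    rep = ["<%s>" % obj.__class__.__name__]
    for i, entry in enumerate(entries):
        rep.append("%2d %-25s %s" % (i, repr(entry.dlldescname), repr(entry)))
    return "\n".join(rep)

Each __repr__ would then be a one-liner, returning _repr_dll_entries(self, self.impdesc) or the delaydesc equivalent.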


Function build_graph has a Cognitive Complexity of 51 (exceeds 5 allowed). Consider refactoring.
Open

def build_graph(start_addr, type_graph, simplify=False, use_ida_stack=True, dontmodstack=False, loadint=False, verbose=False):
    machine = guess_machine(addr=start_addr)
    dis_engine, lifter_model_call = machine.dis_engine, machine.lifter_model_call

    class IRADelModCallStack(lifter_model_call):
Severity: Minor
Found in example/ida/graph_ir.py - About 7 hrs to fix


Function simp_slice has a Cognitive Complexity of 51 (exceeds 5 allowed). Consider refactoring.
Open

def simp_slice(e_s, expr):
    "Slice optimization"

    # slice(A, 0, a.size) => A
    if expr.start == 0 and expr.stop == expr.arg.size:
Severity: Minor
Found in miasm/expression/simplifications_common.py - About 7 hrs to fix
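
A dispatcher like this usually accumulates one branch per slice pattern, which is what drives the cognitive complexity up. One generic way to flatten it (illustrative only, not miasm's code; the rule shown is just the first pattern from the excerpt) is to give each pattern its own small rule function and iterate over them:

def _rule_identity_slice(e_s, expr):
    # slice(A, 0, A.size) => A
    if expr.start == 0 and expr.stop == expr.arg.size:
        return expr.arg
    return None

SLICE_RULES = [_rule_identity_slice]  # further per-pattern rules would follow

def simp_slice(e_s, expr):
    "Slice optimization"
    for rule in SLICE_RULES:
        result = rule(e_s, expr)
        if result is not None:
            return result
    return expr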

