digidotcom/python-suitcase

Showing 31 of 31 total issues

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(self, field, algo, start, end, **kwargs):
Severity: Minor
Found in suitcase/fields.py - About 35 mins to fix
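
One common way to bring a long __init__ signature back under the limit is to group related parameters into a small value object. The sketch below uses hypothetical names (CrcField, CrcRange), not Suitcase's actual API:

```python
from dataclasses import dataclass


@dataclass
class CrcRange:
    """Hypothetical value object grouping the related checksum parameters."""
    algo: str
    start: int
    end: int


class CrcField:
    # Accepting one object for the related values drops the argument
    # count from five to three without losing any information.
    def __init__(self, field, crc_range, **kwargs):
        self.field = field
        self.crc_range = crc_range


checksum = CrcField("payload", CrcRange(algo="crc16", start=0, end=-2))
```

The same approach applies to the other flagged constructors: any cluster of arguments that always travel together is a candidate for its own type.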

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(self, length_field, get_length=None, set_length=None, multiplier=1, **kwargs):
Severity: Minor
Found in suitcase/fields.py - About 35 mins to fix

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(self, length_provider, dispatch_field,
Severity: Minor
Found in suitcase/fields.py - About 35 mins to fix

Similar blocks of code found in 4 locations. Consider refactoring.
Open

    @property
    def bytes_required(self):
        if self.length_provider is None:
            return None
        else:
Severity: Major
Found in suitcase/fields.py and 3 other locations - About 35 mins to fix
suitcase/fields.py on lines 426..431
suitcase/fields.py on lines 943..948
suitcase/fields.py on lines 950..955

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code tends both to keep replicating and to diverge (leaving bugs as two similar implementations differ in subtle ways).
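
As a sketch of one DRY fix for the near-identical properties flagged above, the shared None-check can live in a single helper that each property delegates to. All names here (_provided, ExampleField, FixedLength) are illustrative, not Suitcase's actual classes:

```python
def _provided(provider, extract):
    """Return None when the provider is absent, otherwise extract(provider)."""
    return None if provider is None else extract(provider)


class ExampleField:
    """Hypothetical stand-in for the four near-identical field classes."""

    def __init__(self, length_provider=None):
        self.length_provider = length_provider

    @property
    def bytes_required(self):
        # All four duplicated properties can delegate to the shared helper,
        # leaving only the per-class extraction logic in place.
        return _provided(self.length_provider, lambda p: p.value)


class FixedLength:
    """Toy provider used only for this demonstration."""
    def __init__(self, value):
        self.value = value


bound = ExampleField(FixedLength(12))
unbound = ExampleField()
```

With this shape, a future change to the "no provider yet" behavior happens in exactly one place.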

Tuning

This issue has a mass of 33.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Similar blocks of code found in 4 locations. Consider refactoring.
Open

    @property
    def num_elements(self):
        if self.num_elements_provider is None:
            return None
        else:
Severity: Major
Found in suitcase/fields.py and 3 other locations - About 35 mins to fix
suitcase/fields.py on lines 426..431
suitcase/fields.py on lines 727..732
suitcase/fields.py on lines 943..948

Similar blocks of code found in 4 locations. Consider refactoring.
Open

    @property
    def bytes_required(self):
        if self.length_provider is None:
            return None
        else:
Severity: Major
Found in suitcase/fields.py and 3 other locations - About 35 mins to fix
suitcase/fields.py on lines 426..431
suitcase/fields.py on lines 727..732
suitcase/fields.py on lines 950..955

Similar blocks of code found in 4 locations. Consider refactoring.
Open

    @property
    def bytes_required(self):
        if self.length_provider is None:
            return None
        else:
Severity: Major
Found in suitcase/fields.py and 3 other locations - About 35 mins to fix
suitcase/fields.py on lines 727..732
suitcase/fields.py on lines 943..948
suitcase/fields.py on lines 950..955

Function pack has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def pack(self, stream):
        try:
            keep_bytes = getattr(self, 'KEEP_BYTES', None)
            if keep_bytes is not None:
                if self.PACK_FORMAT[0] == b">"[0]:  # The element access makes this compatible with Python 2 and 3
Severity: Minor
Found in suitcase/fields.py - About 25 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
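
The rules above show up clearly in a small before-and-after sketch: nesting drives the score up, while guard clauses keep the flow linear. Both functions are illustrative and unrelated to Suitcase's code:

```python
def describe_nested(value):
    # Nested flow: each level of nesting adds to the cognitive
    # complexity score, even though the logic is simple.
    if value is not None:
        if value >= 0:
            if value % 2 == 0:
                return "even"
            else:
                return "odd"
        else:
            return "negative"
    else:
        return "missing"


def describe_flat(value):
    # Guard clauses handle the special cases up front, so the main
    # flow never nests; the metric rewards this shape.
    if value is None:
        return "missing"
    if value < 0:
        return "negative"
    return "even" if value % 2 == 0 else "odd"
```

The two functions are behaviorally identical; only the shape of the control flow differs.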

Function unpack has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def unpack(self, data, **kwargs):
        value = 0
        if self.UNPACK_FORMAT[0] == b">"[0]:  # The element access makes this compatible with Python 2 and 3
            for i, byte in enumerate(reversed(struct.unpack(self.UNPACK_FORMAT, data))):
                value |= (byte << (i * 8))
Severity: Minor
Found in suitcase/fields.py - About 25 mins to fix
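
The reassembly loop in the snippet can be sketched as a standalone function; this is a simplified illustration of the byte-order handling, not the library's actual field implementation:

```python
import struct


def unpack_uint(data, fmt):
    """Reassemble an unsigned integer from individually unpacked bytes.

    fmt is a struct format of single bytes, e.g. b">BB"; as in the
    snippet above, the leading character selects the byte order.
    """
    value = 0
    byte_values = struct.unpack(fmt, data)
    if fmt[0] == b">"[0]:  # big-endian: the last byte is least significant
        byte_values = reversed(byte_values)
    for i, byte in enumerate(byte_values):
        value |= byte << (i * 8)
    return value
```

For example, unpack_uint(b"\x01\x02", b">BB") yields 0x0102, while the little-endian format b"<BB" yields 0x0201 for the same two bytes.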

Function __init__ has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def __init__(self, number_bits, field=None, **kwargs):
        BaseField.__init__(self, **kwargs)
        self._ordered_bitfields = []
        self._bitfield_map = {}
        if number_bits % 8 != 0:
Severity: Minor
Found in suitcase/fields.py - About 25 mins to fix
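
A common fix for a complex constructor is to move validation and registration into small named helpers, keeping the constructor's own flow linear. The class below is a hypothetical sketch, not the library's BitField:

```python
class BitFieldSketch:
    """Hypothetical sketch only; not Suitcase's actual BitField."""

    def __init__(self, number_bits, fields=(), **kwargs):
        self._validate_width(number_bits)
        self.number_bits = number_bits
        self._ordered_bitfields = []
        self._bitfield_map = {}
        for name, field in fields:
            self._register(name, field)

    @staticmethod
    def _validate_width(number_bits):
        # Failing fast in a named helper keeps the branch out of the
        # constructor's main flow.
        if number_bits % 8 != 0:
            raise ValueError("number_bits must be a multiple of 8, got %d"
                             % number_bits)

    def _register(self, name, field):
        self._ordered_bitfields.append((name, field))
        self._bitfield_map[name] = field
```

Each helper does one thing, so __init__ reads as a short sequence of steps rather than nested branches.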

Function feed has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def feed(self, new_bytes):
        """Feed a new set of bytes into the protocol handler

        These bytes will be immediately fed into the parsing state machine and
        if new packets are found, the ``packet_callback`` will be executed
Severity: Minor
Found in suitcase/protocol.py - About 25 mins to fix
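
The feed pattern described in the docstring (buffer incoming bytes, invoke a callback for each complete packet) can be sketched in miniature with a one-byte length prefix; the framing and names here are illustrative, not Suitcase's protocol handler:

```python
class LengthPrefixedFeeder:
    """Minimal sketch of the feed pattern: one length byte, then payload."""

    def __init__(self, packet_callback):
        self._buffer = b""
        self._packet_callback = packet_callback

    def feed(self, new_bytes):
        # Accumulate bytes and emit every complete frame found so far;
        # a partial frame simply waits in the buffer for the next feed.
        self._buffer += new_bytes
        while self._buffer:
            length = self._buffer[0]
            if len(self._buffer) < 1 + length:
                break  # wait for more bytes
            packet = self._buffer[1:1 + length]
            self._buffer = self._buffer[1 + length:]
            self._packet_callback(packet)


received = []
feeder = LengthPrefixedFeeder(received.append)
feeder.feed(b"\x02hi")   # one complete frame
feeder.feed(b"\x03ab")   # partial frame: held in the buffer
feeder.feed(b"c")        # completes the second frame
```

Keeping the per-frame logic in the loop body this flat is what holds the cognitive complexity of a feed method down.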
