Shoobx/pjpersist

Showing 89 of 89 total issues

File datamanager.py has 675 lines of code (exceeds 250 allowed). Consider refactoring.
Open

##############################################################################
#
# Copyright (c) 2011 Zope Foundation and Contributors.
# Copyright (c) 2014 Shoobx, Inc.
# All Rights Reserved.
Severity: Major
Found in src/pjpersist/datamanager.py - About 1 day to fix

File serialize.py has 505 lines of code (exceeds 250 allowed). Consider refactoring.
Open

##############################################################################
#
# Copyright (c) 2011 Zope Foundation and Contributors.
# Copyright (c) 2014 Shoobx, Inc.
# All Rights Reserved.
Severity: Major
Found in src/pjpersist/serialize.py - About 1 day to fix

File container.py has 465 lines of code (exceeds 250 allowed). Consider refactoring.
Open

##############################################################################
#
# Copyright (c) 2011 Zope Foundation and Contributors.
# Copyright (c) 2014 Shoobx, Inc.
# All Rights Reserved.
Severity: Minor
Found in src/pjpersist/zope/container.py - About 7 hrs to fix

Function get_object has a Cognitive Complexity of 44 (exceeds 5 allowed). Consider refactoring.
Open

    def get_object(self, state, obj):
        # stateIsDict and state_py_type: optimization to avoid X lookups
        # the code was:
        # if isinstance(state, dict) and state.get('_py_type') == 'DBREF':
        # this method gets called a gazillion times, so being fast is crucial
Severity: Minor
Found in src/pjpersist/serialize.py - About 6 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

Further reading
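
A small, hypothetical Python example (not taken from pjpersist) makes these rules concrete: the nested variant pays an increasing penalty for every flow break placed inside another flow break, while the guard-clause variant expresses the same logic with almost no nesting.

# Hypothetical illustration of the metric, not pjpersist code.
def describe_nested(state):
    result = "unknown"
    if state is not None:                  # +1: a break in the linear flow
        if isinstance(state, dict):        # +2: nested one level deeper
            if state.get('_py_type'):      # +3: nested two levels deeper
                result = "typed dict"
    return result

def describe_flat(state):
    # Same behaviour, but each guard clause is a single, un-nested break
    # in the flow, so the cognitive load stays low.
    if state is None:
        return "unknown"
    if not isinstance(state, dict):
        return "unknown"
    if not state.get('_py_type'):
        return "unknown"
    return "typed dict"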

PJDataManager has 42 functions (exceeds 20 allowed). Consider refactoring.
Open

class PJDataManager(object):

    root = None

    # Data manager is completely new. NOTE: It is important to leave this
Severity: Minor
Found in src/pjpersist/datamanager.py - About 5 hrs to fix

Function get_state has a Cognitive Complexity of 35 (exceeds 5 allowed). Consider refactoring.
Open

    def get_state(self, obj, pobj=None):
        objectType = type(obj)
        __traceback_info__ = obj, objectType, pobj
        if objectType in interfaces.PJ_NATIVE_TYPES:
            # If we have a native type, we'll just use it as the state.
Severity: Minor
Found in src/pjpersist/serialize.py - About 5 hrs to fix

PJContainer has 39 functions (exceeds 20 allowed). Consider refactoring.
Open

class PJContainer(contained.Contained,
                  persistent.Persistent,
                  MutableMapping):
    _pj_table = None
    _pj_mapping_key = 'key'
Severity: Minor
Found in src/pjpersist/zope/container.py - About 5 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    @property
    def __name__(self):
        if self._v_name is None:
            if self._pj_name_attr is not None:
                self._v_name = getattr(self, self._pj_name_attr, None)
Severity: Major
Found in src/pjpersist/zope/container.py and 1 other location - About 4 hrs to fix
src/pjpersist/zope/container.py on lines 63..70

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 76.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Refactorings

Further Reading

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    @property
    def __parent__(self):
        if self._v_parent is None:
            if self._pj_parent_attr is not None:
                self._v_parent = getattr(self, self._pj_parent_attr, None)
Severity: Major
Found in src/pjpersist/zope/container.py and 1 other location - About 4 hrs to fix
src/pjpersist/zope/container.py on lines 48..55
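
The two flagged properties differ only in which cache attribute and which configuration attribute they consult. A minimal, hypothetical sketch of the DRY refactoring (the mixin and helper names are inventions for illustration, and the real properties contain further logic not shown in the excerpts above):

# Illustrative only; not pjpersist's actual container.py.
class _CachedAttrMixin:
    _v_name = None
    _v_parent = None
    _pj_name_attr = None
    _pj_parent_attr = None

    def _lookup_cached(self, cache_attr, config_attr):
        """Return a cached value, computing it from a configured attribute."""
        value = getattr(self, cache_attr)
        if value is None and getattr(self, config_attr) is not None:
            value = getattr(self, getattr(self, config_attr), None)
            setattr(self, cache_attr, value)
        return value

    @property
    def __name__(self):
        return self._lookup_cached('_v_name', '_pj_name_attr')

    @property
    def __parent__(self):
        return self._lookup_cached('_v_parent', '_pj_parent_attr')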

Function operator_expr has a Cognitive Complexity of 24 (exceeds 5 allowed). Consider refactoring.
Open

    def operator_expr(self, operator, field, key, op2):
        op1 = self.getField(field, key, json=True)
        # some values, esp. datetime must go through PJ serialize
        pjvalue = serialize.ObjectWriter(None).get_state(op2)
        op2j = json.dumps(pjvalue, sort_keys=True)
Severity: Minor
Found in src/pjpersist/mquery.py - About 3 hrs to fix

Function convert has a Cognitive Complexity of 22 (exceeds 5 allowed). Consider refactoring.
Open

    def convert(self, query):
        clauses = []
        doc = sb.Field(self.table, self.field)
        for key, value in sorted(query.items()):
            accessor = self.getField(doc, key, json=True)
Severity: Minor
Found in src/pjpersist/mquery.py - About 3 hrs to fix

Function flush has a Cognitive Complexity of 21 (exceeds 5 allowed). Consider refactoring.
Open

    def flush(self, flush_hint=None):
        # flush_hint contains tables that we want to flush, leaving all other
        # objects registered.
        #
        # While writing objects, new sub-objects might be registered
Severity: Minor
Found in src/pjpersist/datamanager.py - About 2 hrs to fix
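
The excerpt's comment describes the usual "drain until stable" pattern: writing one object can register new sub-objects, so the loop must keep going until nothing is left to write. A hedged, generic sketch of that pattern (the names pending and write_one are hypothetical; this is not pjpersist's actual flush()):

# Generic sketch of a re-entrant flush loop, for illustration only.
def drain_pending(pending, write_one):
    """Write every queued object.

    ``write_one`` may append newly discovered sub-objects to ``pending``
    while it runs, so the loop drains until the queue is genuinely empty.
    """
    while pending:
        obj = pending.pop()
        write_one(obj)  # may grow ``pending`` again

# Example: writing 'root' discovers one sub-object, which is then written too.
queue = ['root']
drain_pending(queue, lambda obj: queue.append('child') if obj == 'root' else None)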

Identical blocks of code found in 2 locations. Consider refactoring.
Open

def link_to_parent(obj, pobj):
    if obj._p_jar is None:
        if pobj is not None and getattr(pobj, '_p_jar', None) is not None:
            obj._p_jar = pobj._p_jar
        setattr(obj, interfaces.DOC_OBJECT_ATTR_NAME, pobj)
Severity: Major
Found in src/pjpersist/serialize.py and 1 other location - About 2 hrs to fix
src/pjpersist/serialize.py on lines 375..380

Identical blocks of code found in 2 locations. Consider refactoring.
Open

        if getattr(obj, interfaces.SUB_OBJECT_ATTR_NAME, False):
            if obj._p_jar is None:
                if pobj is not None and \
                        getattr(pobj, '_p_jar', None) is not None:
                    obj._p_jar = pobj._p_jar
Severity: Major
Found in src/pjpersist/serialize.py and 1 other location - About 2 hrs to fix
src/pjpersist/serialize.py on lines 121..125
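
Because serialize.py already defines a link_to_parent helper with this same body (the earlier finding, lines 375..380), the inline copy could delegate to it. A hedged sketch of what the call site might look like (the enclosing method and the wrapper name _link_sub_object are not shown in the excerpt and are purely illustrative; the interfaces import is assumed from the attribute access seen above):

# Illustrative reconstruction, not the project's actual code.
from pjpersist import interfaces  # assumed import, as used in the excerpts

def link_to_parent(obj, pobj):
    if obj._p_jar is None:
        if pobj is not None and getattr(pobj, '_p_jar', None) is not None:
            obj._p_jar = pobj._p_jar
        setattr(obj, interfaces.DOC_OBJECT_ATTR_NAME, pobj)

def _link_sub_object(obj, pobj):
    # Reuse the helper instead of repeating its body inline.
    if getattr(obj, interfaces.SUB_OBJECT_ATTR_NAME, False):
        link_to_parent(obj, pobj)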

Similar blocks of code found in 2 locations. Consider refactoring.
Open

class UnionAll(Union):
    def __sqlrepr__(self, db):
        return " UNION ALL ".join(["( %s )" % str(sqlrepr(t, db))
                                  for t in self.tables])
Severity: Major
Found in src/pjpersist/sqlbuilder.py and 1 other location - About 1 hr to fix
src/pjpersist/sqlbuilder.py on lines 274..279

Function _get_sb_fields has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.
Open

    def _get_sb_fields(self, fields):
        """Return sqlbuilder columns for a SELECT query based on
        passed field names or * if no fields are passed

        Prefers native columns where available
Severity: Minor
Found in src/pjpersist/zope/container.py - About 1 hr to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

class _BetterUnion(Union):
    # We need to enclose queries we union into parenthesis, so queries can have
    # ORDER BY clause.
    def __sqlrepr__(self, db):
        return " UNION ".join(["( %s )" % str(sqlrepr(t, db))
Severity: Major
Found in src/pjpersist/sqlbuilder.py and 1 other location - About 1 hr to fix
src/pjpersist/sqlbuilder.py on lines 281..284
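
The two __sqlrepr__ implementations flagged here and in the UnionAll finding above differ only in the join keyword, so the parenthesizing could live in one shared base class. A hedged sketch (the name _ParenthesizedUnion is an invention; Union and sqlrepr are assumed to be in scope exactly as in the excerpts):

# Illustrative sketch only; not the project's actual sqlbuilder.py.
class _ParenthesizedUnion(Union):
    # Enclose each unioned query in parentheses so ORDER BY clauses survive.
    _keyword = " UNION "

    def __sqlrepr__(self, db):
        return self._keyword.join(
            "( %s )" % str(sqlrepr(t, db)) for t in self.tables)

class _BetterUnion(_ParenthesizedUnion):
    pass

class UnionAll(_ParenthesizedUnion):
    _keyword = " UNION ALL "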

Function resolve has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.
Open

    def resolve(self, dbref):
        __traceback_info__ = dbref
        # 1. Try to optimize on whether there's just one class stored in one
        #    table, that can save us one DB query
        if dbref.table in TABLE_KLASS_MAP:
Severity: Minor
Found in src/pjpersist/serialize.py - About 1 hr to fix
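
The comment sketches a fast path: when the referenced table is known to hold exactly one class, the class can be resolved without a database round trip. A hedged, standalone illustration of that idea (the function name and the assumption that the map stores a collection of classes per table are mine, not pjpersist's actual resolve()):

# Illustrative only; not the project's actual resolve().
def resolve_single_class(dbref, table_klass_map):
    """Return the class for ``dbref`` if its table holds exactly one class.

    Returns None when the table is unknown or ambiguous; the caller would
    then fall back to loading the stored class path from the database.
    """
    klasses = table_klass_map.get(dbref.table, ())
    if len(klasses) == 1:
        return next(iter(klasses))  # no DB query needed
    return None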

Similar blocks of code found in 3 locations. Consider refactoring.
Open

                    if '.' not in name:
                        accessor = sb.JSON_GETITEM(datafld, name)
                    else:
                        accessor = sb.JSON_PATH(datafld, name.split("."))
Severity: Major
Found in src/pjpersist/zope/container.py and 2 other locations - About 1 hr to fix
src/pjpersist/mquery.py on lines 84..88
src/pjpersist/mquery.py on lines 84..93

Function get_non_persistent_state has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

    def get_non_persistent_state(self, obj):
        objectId = id(obj)
        objectType = type(obj)
        __traceback_info__ = obj, objectType, objectId
        # XXX: Look at the pickle library how to properly handle all types and
Severity: Minor
Found in src/pjpersist/serialize.py - About 1 hr to fix
