KarrLab/obj_tables

Showing 542 of 542 total issues

File core.py has 7031 lines of code (exceeds 250 allowed). Consider refactoring.

""" Toolkit for modeling complex datasets with collections of user-friendly tables

Many classes contain the methods ``serialize()`` and ``deserialize()``, which invert each other.
``serialize()`` converts a Python object instance into a string representation, whereas
``deserialize()`` parses an object's string representation -- as would be stored in a file or spreadsheet
Severity: Major
Found in obj_tables/core.py - About 2 wks to fix

    File migrate.py has 2830 lines of code (exceeds 250 allowed). Consider refactoring.

    """ Support schema migration
    
    :Author: Arthur Goldberg <Arthur.Goldberg@mssm.edu>
    :Date: 2018-11-18
    :Copyright: 2018, Karr Lab
    Severity: Major
    Found in obj_tables/migrate.py - About 1 wk to fix

      File io.py has 2243 lines of code (exceeds 250 allowed). Consider refactoring.

      """ Reading/writing schema objects to/from files
      
      * Comma separated values (.csv)
      * XLSX (.xlsx)
      * JavaScript Object Notation (.json)
      Severity: Major
      Found in obj_tables/io.py - About 6 days to fix

        File expression.py has 1493 lines of code (exceeds 250 allowed). Consider refactoring.

        """ Utilities for processing mathematical expressions used by obj_tables models
        
        :Author: Arthur Goldberg <Arthur.Goldberg@mssm.edu>
        :Author: Jonathan Karr <karr@mssm.edu>
        :Date: 2018-12-19
        Severity: Major
        Found in obj_tables/math/expression.py - About 3 days to fix

          Function init_schema has a Cognitive Complexity of 154 (exceeds 5 allowed). Consider refactoring.

          def init_schema(filename, out_filename=None):
              """ Initialize an `ObjTables` schema from a tabular declarative specification in
               :obj:`filename`. :obj:`filename` can be an XLSX, CSV, or TSV file.
          
              Schemas (classes and attributes) should be defined using the following tabular format.
          Severity: Minor
          Found in obj_tables/utils.py - About 3 days to fix
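
To make the flagged function concrete, here is a brief usage sketch based only on the signature and docstring shown above; the file names are hypothetical, and the assumption that the generated Python module is written to out_filename comes from the parameter name rather than from documentation shown here.

    # Sketch only: file names are hypothetical, and the return value of
    # init_schema is not shown above, so it is not relied on.
    from obj_tables import utils

    # Build an ObjTables schema from a tabular declarative specification
    # (XLSX, CSV, or TSV), presumably writing the generated module to
    # out_filename.
    utils.init_schema('schema.xlsx', out_filename='schema.py')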

          Cognitive Complexity

          Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

          A method's cognitive complexity is based on a few simple rules:

          • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
          • Code is considered more complex for each "break in the linear flow of the code"
          • Code is considered more complex when "flow breaking structures are nested"
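
To make these rules concrete, here is a small illustrative sketch (not code from obj_tables): the first function nests several flow-breaking structures, which the nesting rule penalizes, while the second performs the same work with guard clauses and a flatter flow.

    # Illustrative only; not taken from obj_tables.
    def count_positive_nested(rows):
        # Nested ifs inside a loop: each extra level of nesting raises
        # cognitive complexity.
        count = 0
        for row in rows:
            if row is not None:
                if 'value' in row:
                    if row['value'] > 0:
                        count += 1
        return count

    def count_positive_flat(rows):
        # Same behavior with guard clauses: fewer nested flow breaks,
        # so the function is easier to read.
        count = 0
        for row in rows:
            if row is None or 'value' not in row:
                continue
            if row['value'] > 0:
                count += 1
        return count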


          Function run has a Cognitive Complexity of 147 (exceeds 5 allowed). Consider refactoring.

              def run(self, path, schema_name=None, models=None,
                      allow_multiple_sheets_per_model=False,
                      ignore_missing_models=False, ignore_extra_models=False,
                      ignore_sheet_order=False,
                      include_all_attributes=True, ignore_missing_attributes=False, ignore_extra_attributes=False,
          Severity: Minor
          Found in obj_tables/io.py - About 2 days to fix
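
The snippet above shows only part of the method's signature and not the class that owns it. As a hedged sketch, assuming this is the run method of the reader class in obj_tables.io and that it returns the parsed objects, a call might look like the following; the file name and the MyModel schema class are hypothetical.

    # Sketch only: Reader as the owning class, the return value, and the flag
    # semantics are assumptions; only the parameter names come from the
    # signature above.
    import obj_tables
    import obj_tables.io as io

    class MyModel(obj_tables.Model):
        # Hypothetical schema class for illustration.
        id = obj_tables.StringAttribute(verbose_name='Id')

    objects = io.Reader().run(
        'data.xlsx',               # path to the file to read
        models=[MyModel],          # schema classes to instantiate
        ignore_extra_models=True,  # presumably: skip sheets with no matching model
        ignore_sheet_order=True,   # presumably: accept sheets in any order
    )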


          Similar blocks of code found in 2 locations. Consider refactoring.

          class QuantityInfo(obj_tables.Model):
              quantity_type = obj_tables.StringAttribute(verbose_name='QuantityType')
              symbol = obj_tables.StringAttribute(verbose_name='Symbol')
              unit = obj_tables.StringAttribute(verbose_name='Unit')
              constant = obj_tables.StringAttribute(verbose_name='Constant')
          Severity: Major
          Found in examples/sbtab/SBtab.py and 1 other location - About 2 days to fix
          examples/sbtab/SBtab.py on lines 220..265

          Duplicated Code

          Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

          Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

          When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

          Tuning

          This issue has a mass of 305.

          We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

          The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

          If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

          See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
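
As a hedged illustration of one way to remove duplication like this: if the columns that the two blocks share were moved to a common base class, each generated model would declare them once. The matching block at SBtab.py lines 220..265 is not shown above, so which attributes actually repeat is an assumption here, as is the premise that obj_tables models inherit attributes declared on an intermediate Model subclass; SharedColumns is a hypothetical name.

    # Sketch only: assumes attribute inheritance from a shared base class and
    # that these four columns are the ones the two blocks have in common.
    import obj_tables

    class SharedColumns(obj_tables.Model):
        quantity_type = obj_tables.StringAttribute(verbose_name='QuantityType')
        symbol = obj_tables.StringAttribute(verbose_name='Symbol')
        unit = obj_tables.StringAttribute(verbose_name='Unit')
        constant = obj_tables.StringAttribute(verbose_name='Constant')

    class QuantityInfo(SharedColumns):
        # Only the attributes unique to QuantityInfo would remain here.
        pass

A later issue in this report notes that SBtab.py is generated automatically, so in practice a change like this would belong in the code generator rather than in the generated file.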


          Similar blocks of code found in 2 locations. Consider refactoring.

          class Enzyme(obj_tables.Model):
              comment = obj_tables.StringAttribute(verbose_name='Comment')
              reference_name = obj_tables.StringAttribute(verbose_name='ReferenceName')
              reference_pub_med = obj_tables.StringAttribute(verbose_name='ReferencePubMed')
              reference_d_o_i = obj_tables.StringAttribute(verbose_name='ReferenceDOI')
          Severity: Major
          Found in examples/sbtab/SBtab.py and 1 other location - About 2 days to fix
          examples/sbtab/SBtab.py on lines 814..859

          This issue has a mass of 305.

          File SBtab.py has 1031 lines of code (exceeds 250 allowed). Consider refactoring.

          # Schema automatically generated at 2020-05-29 00:27:56
          
          import obj_tables
          
          
          
          Severity: Major
          Found in examples/sbtab/SBtab.py - About 2 days to fix

            Function import_module_for_migration has a Cognitive Complexity of 116 (exceeds 5 allowed). Consider refactoring.

                def import_module_for_migration(self, validate=True, required_attrs=None, debug=False,
                                                mod_patterns=None, print_code=False):
                    """ Import a schema from a Python module in a file, which may be in a package
            
                    Args:
            Severity: Minor
            Found in obj_tables/migrate.py - About 2 days to fix


            Function read_model has a Cognitive Complexity of 109 (exceeds 5 allowed). Consider refactoring.

                def read_model(self, reader, sheet_name, schema_name, model, include_all_attributes=True,
                               ignore_missing_attributes=False, ignore_extra_attributes=False,
                               ignore_attribute_order=False, ignore_empty_rows=True,
                               validate=True):
                    """ Instantiate a list of objects from data in a table in a file
            Severity: Minor
            Found in obj_tables/io.py - About 2 days to fix


            File utils.py has 881 lines of code (exceeds 250 allowed). Consider refactoring.

            """ Utilities
            
            :Author: Jonathan Karr <karr@mssm.edu>
            :Author: Arthur Goldberg <Arthur.Goldberg@mssm.edu>
            :Date: 2016-11-23
            Severity: Major
            Found in obj_tables/utils.py - About 2 days to fix

              File core.py has 865 lines of code (exceeds 250 allowed). Consider refactoring.

              """ Chemistry attributes
              
              :Author: Jonathan Karr <karr@mssm.edu>
              :Date: 2017-05-10
              :Copyright: 2017, Karr Lab
              Severity: Major
              Found in obj_tables/chem/core.py - About 2 days to fix

                Function write_model has a Cognitive Complexity of 94 (exceeds 5 allowed). Consider refactoring.

                    def write_model(self, writer, model, objects, schema_name, date, doc_metadata, doc_metadata_model, model_metadata, sheet_models,
                                    include_all_attributes=True, encoded=None, write_empty_models=True, write_empty_cols=True,
                                    extra_entries=0, protected=True):
                        """ Write a list of model objects to a file
                
                
                Severity: Minor
                Found in obj_tables/io.py - About 1 day to fix


                Function from_dict has a Cognitive Complexity of 89 (exceeds 5 allowed). Consider refactoring.

                    def from_dict(json, models, decode_primary_objects=True, primary_objects=None, decoded=None, ignore_extra_models=False,
                                  validate=False, output_format=None):
                        """ Decode a simple Python representation (dict, list, str, float, bool, None) of an object that
                        is compatible with JSON and YAML, including references to objects through :obj:`__id` keys.
                
                
                Severity: Minor
                Found in obj_tables/core.py - About 1 day to fix


                Similar blocks of code found in 2 locations. Consider refactoring.

                    def get_xlsx_validation(self, sheet_models=None, doc_metadata_model=None):
                        """ Get XLSX validation
                
                        Args:
                            sheet_models (:obj:`list` of :obj:`Model`, optional): models encoded as separate sheets
                Severity: Major
                Found in obj_tables/chem/core.py and 1 other location - About 1 day to fix
                obj_tables/chem/core.py on lines 1031..1067

                This issue has a mass of 193.

                Similar blocks of code found in 2 locations. Consider refactoring.

                    def get_xlsx_validation(self, sheet_models=None, doc_metadata_model=None):
                        """ Get XLSX validation
                
                        Args:
                            sheet_models (:obj:`list` of :obj:`Model`, optional): models encoded as separate sheets
                Severity: Major
                Found in obj_tables/chem/core.py and 1 other location - About 1 day to fix
                obj_tables/chem/core.py on lines 159..195

                This issue has a mass of 193.

                Function viz_schema has a Cognitive Complexity of 79 (exceeds 5 allowed). Consider refactoring.

                def viz_schema(module, filename, attributes=True, tail_labels=True, hidden_classes=None, extra_edges=None,
                               model_names=None,
                               rank_sep=None,
                               node_sep=None,
                               node_width=None,
                Severity: Minor
                Found in obj_tables/utils.py - About 1 day to fix
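
A brief usage sketch based only on the parameters visible in the signature above: the toy schema module below is built in memory purely for illustration, and the assumptions that filename names the output file and that viz_schema discovers models through the module's attributes are not confirmed by the snippet.

    # Sketch only: the toy module, the output file name, and the model
    # discovery behavior are assumptions.
    import types
    import obj_tables
    from obj_tables import utils

    class Compound(obj_tables.Model):
        id = obj_tables.StringAttribute(verbose_name='Id')

    toy_schema = types.ModuleType('toy_schema')
    toy_schema.Compound = Compound

    # Draw the schema, presumably writing the diagram to the given file.
    utils.viz_schema(toy_schema, 'toy_schema.svg', attributes=True)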


                Similar blocks of code found in 2 locations. Consider refactoring.

                        if objects and self.species_cls:
                            if not isinstance(self.species_cls, type):
                                for cls in objects.keys():
                                    if '.' in self.species_cls:
                                        if cls.__module__ + '.' + cls.__name__ == self.species_cls:
                Severity: Major
                Found in obj_tables/chem/core.py and 1 other location - About 1 day to fix
                obj_tables/chem/core.py on lines 924..944

                This issue has a mass of 167.

                Similar blocks of code found in 2 locations. Consider refactoring.

                        if objects and self.compartment_cls:
                            if not isinstance(self.compartment_cls, type):
                                for cls in objects.keys():
                                    if '.' in self.compartment_cls:
                                        if cls.__module__ + '.' + cls.__name__ == self.compartment_cls:
                Severity: Major
                Found in obj_tables/chem/core.py and 1 other location - About 1 day to fix
                obj_tables/chem/core.py on lines 902..922
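
Both truncated blocks above appear to resolve a class from either a class object or a class-name string (possibly module-qualified), so one hedged way to remove the duplication is to move that lookup into a single helper that both the species_cls and compartment_cls call sites use. The full bodies (chem/core.py lines 902..944) are not shown here, so the helper below is a guess at their intent, and _resolve_model_cls is a hypothetical name.

    # Sketch only: behavior inferred from the truncated snippets above.
    def _resolve_model_cls(objects, cls_spec):
        """Return the class in ``objects`` that ``cls_spec`` refers to.

        ``cls_spec`` may already be a class, a dotted ``module.Class`` string,
        or a bare class name; ``objects`` is keyed by model classes, as in the
        snippets above.
        """
        if isinstance(cls_spec, type):
            return cls_spec
        for cls in objects.keys():
            if '.' in cls_spec:
                if cls.__module__ + '.' + cls.__name__ == cls_spec:
                    return cls
            elif cls.__name__ == cls_spec:
                return cls
        return None

Each call site could then reduce to something like self.species_cls = _resolve_model_cls(objects, self.species_cls), with the parallel line for compartment_cls.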

                This issue has a mass of 167.
