DeepRegNet/DeepReg

Showing 113 of 113 total issues

File layer.py has 520 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""This module defines custom layers."""
import itertools
from typing import List, Tuple, Union

import numpy as np
Severity: Major
Found in deepreg/model/layer.py - About 1 day to fix

File interface.py has 513 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""
Interface between the data loaders and file loaders.
"""

from abc import ABC
Severity: Major
Found in deepreg/dataset/loader/interface.py - About 1 day to fix

File u_net.py has 508 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# coding=utf-8

from typing import List, Optional, Tuple, Union

import tensorflow as tf
Severity: Major
Found in deepreg/model/backbone/u_net.py - About 1 day to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

Severity: Major
Found in demos/classical_ct_headneck_affine/demo_data.py and 1 other location - About 1 day to fix
demos/classical_mr_prostate_nonrigid/demo_data.py on lines 0..24

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency both to continue to replicate and to diverge (leaving bugs as the two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 124.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
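As a concrete sketch, a per-language threshold override in .codeclimate.yml looks roughly like the fragment below; the key layout follows the codeclimate-duplication documentation as best as can be recalled here, and the value 40 is only an illustrative choice, so verify both against the engine's docs:

```yaml
# .codeclimate.yml (sketch): raise mass_threshold to report fewer,
# larger duplicates; lower it for finer-grained comparison
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 40
```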


Similar blocks of code found in 2 locations. Consider refactoring.
Open

Severity: Major
Found in demos/classical_mr_prostate_nonrigid/demo_data.py and 1 other location - About 1 day to fix
demos/classical_ct_headneck_affine/demo_data.py on lines 0..24

Duplicated Code

This issue has a mass of 124.

File network.py has 476 lines of code (exceeds 250 allowed). Consider refactoring.
Open

import os
from abc import abstractmethod
from copy import deepcopy
from typing import Dict, Optional, Tuple

Severity: Minor
Found in deepreg/model/network.py - About 7 hrs to fix

Similar blocks of code found in 3 locations. Consider refactoring.
Open

for folder in os.listdir(path_to_test):
    path_to_folder = os.path.join(path_to_test, folder)
    os.chdir(path_to_folder)
    for file in os.listdir(path_to_folder):
        if "_insp" in file:
Severity: Major
Found in demos/unpaired_ct_lung/demo_data.py and 2 other locations - About 7 hrs to fix
demos/unpaired_ct_lung/demo_data.py on lines 158..170
demos/unpaired_ct_lung/demo_data.py on lines 190..202

Duplicated Code

This issue has a mass of 114.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

for folder in os.listdir(path_to_valid):
    path_to_folder = os.path.join(path_to_valid, folder)
    os.chdir(path_to_folder)
    for file in os.listdir(path_to_folder):
        if "_insp" in file:
Severity: Major
Found in demos/unpaired_ct_lung/demo_data.py and 2 other locations - About 7 hrs to fix
demos/unpaired_ct_lung/demo_data.py on lines 158..170
demos/unpaired_ct_lung/demo_data.py on lines 174..186

Duplicated Code

This issue has a mass of 114.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

for folder in os.listdir(path_to_train):
    path_to_folder = os.path.join(path_to_train, folder)
    os.chdir(path_to_folder)
    for file in os.listdir(path_to_folder):
        if "_insp" in file:
Severity: Major
Found in demos/unpaired_ct_lung/demo_data.py and 2 other locations - About 7 hrs to fix
demos/unpaired_ct_lung/demo_data.py on lines 174..186
demos/unpaired_ct_lung/demo_data.py on lines 190..202

Duplicated Code

This issue has a mass of 114.
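The three flagged blocks differ only in which split directory they walk (train, valid, test), so they can collapse into one helper. The sketch below is hedged: the report truncates each loop after the "_insp" check, so returning the matched paths stands in for whatever the elided body does, and the helper deliberately drops the original os.chdir call so it has no side effects:

```python
import os


def collect_insp_files(path_to_split: str) -> list:
    """Return paths of "_insp" files under each subject folder of one split.

    Illustrative stand-in: the original snippet is truncated after the
    "_insp" check, so the real per-file handling may differ.
    """
    matches = []
    for folder in sorted(os.listdir(path_to_split)):
        path_to_folder = os.path.join(path_to_split, folder)
        if not os.path.isdir(path_to_folder):
            continue
        for file in sorted(os.listdir(path_to_folder)):
            if "_insp" in file:
                matches.append(os.path.join(path_to_folder, file))
    return matches


# one call per split would replace the three duplicated loops:
# for path in (path_to_train, path_to_valid, path_to_test):
#     collect_insp_files(path)
```

Passing the split directory as a parameter, rather than copying the loop, also means a future change (say, matching "_exp" files too) happens in exactly one place.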

File preprocess.py has 398 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""
Module containing data augmentation techniques.
  - 3D Affine/DDF Transforms for moving and fixed images.
"""

Severity: Minor
Found in deepreg/dataset/preprocess.py - About 5 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

for f in img_files:
    num_subject = int(f.split("_")[1].split(".")[0])

    if num_subject < 311:
        shutil.copy(join(path_to_init_img, f), join(path_to_train, "images"))
Severity: Major
Found in demos/unpaired_mr_brain/demo_data.py and 1 other location - About 5 hrs to fix
demos/unpaired_mr_brain/demo_data.py on lines 112..117

Duplicated Code

This issue has a mass of 90.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

for f in img_files:
    num_subject = int(f.split("_")[1].split(".")[0])
    if num_subject < 311:
        shutil.copy(join(path_to_init_label, f), join(path_to_train, "labels"))
    else:
Severity: Major
Found in demos/unpaired_mr_brain/demo_data.py and 1 other location - About 5 hrs to fix
demos/unpaired_mr_brain/demo_data.py on lines 103..109

Duplicated Code

This issue has a mass of 90.
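Both flagged loops share the subject-number split and differ only in source directory and destination subfolder, so one parameterized helper removes the duplication. This is a sketch: the filename pattern and the cutoff of 311 come from the snippets above, but the destination for subjects at or above the cutoff is an assumption, since the report truncates the else branch:

```python
import shutil
from os.path import join


def copy_by_subject(files, src_dir, train_dir, other_dir, subdir, cutoff=311):
    """Copy each file into train or the other split by its subject number.

    Expects names like "<prefix>_<subject>.<ext>"; `other_dir` is assumed,
    as the original else branch is truncated in the report.
    """
    for f in files:
        num_subject = int(f.split("_")[1].split(".")[0])
        dst = train_dir if num_subject < cutoff else other_dir
        shutil.copy(join(src_dir, f), join(dst, subdir))


# the two duplicated loops then become two calls (other_dir is hypothetical):
# copy_by_subject(img_files, path_to_init_img, path_to_train, other_dir, "images")
# copy_by_subject(img_files, path_to_init_label, path_to_train, other_dir, "labels")
```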

File layer_util.py has 361 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""
Module containing utilities for layer inputs
"""
import itertools
from typing import List, Tuple, Union
Severity: Minor
Found in deepreg/model/layer_util.py - About 4 hrs to fix

Function get_intra_sample_indices has a Cognitive Complexity of 27 (exceeds 5 allowed). Consider refactoring.
Open

    def get_intra_sample_indices(self) -> list:
        """
        Calculate the sample indices for intra-group sampling
        The index to identify a sample is (group1, image1, group2, image2), means
        - image1 of group1 is moving image
Severity: Minor
Found in deepreg/dataset/loader/grouped_loader.py - About 3 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

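These rules can be illustrated with a small, generic example (not taken from grouped_loader.py): each nested conditional adds an extra increment on top of its own, while an equivalent guard clause keeps the flow linear without changing behaviour:

```python
def pick_scores_nested(samples):
    # three nested ifs: each level of nesting adds an extra increment
    result = []
    for s in samples:
        if s is not None:
            if s.get("valid"):
                if s["score"] > 0:
                    result.append(s["score"])
    return result


def pick_scores_flat(samples):
    # same behaviour; a single guard clause keeps a linear reading order
    result = []
    for s in samples:
        if s is None or not s.get("valid") or s["score"] <= 0:
            continue
        result.append(s["score"])
    return result
```

Refactorings that replace nested conditionals with early continue/return, or extract the innermost nesting into small helpers, are the usual way to bring a score like 27 back under the threshold.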

File predict.py has 326 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# coding=utf-8

"""
Module to perform predictions on data using
command line interface.
Severity: Minor
Found in deepreg/predict.py - About 3 hrs to fix

Function predict_on_dataset has a Cognitive Complexity of 26 (exceeds 5 allowed). Consider refactoring.
Open

def predict_on_dataset(
    dataset: tf.data.Dataset,
    fixed_grid_ref: tf.Tensor,
    model: tf.keras.Model,
    save_dir: str,
Severity: Minor
Found in deepreg/predict.py - About 3 hrs to fix


File vis.py has 320 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""
Module to generate visualisations of data
at command line interface.
Requires ffmpeg writer to write gif files
"""
Severity: Minor
Found in deepreg/vis.py - About 3 hrs to fix

File label.py has 305 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""Provide different loss or metrics classes for labels."""

import tensorflow as tf

from deepreg.constant import EPS
Severity: Minor
Found in deepreg/loss/label.py - About 3 hrs to fix

Function sample_index_generator has a Cognitive Complexity of 18 (exceeds 5 allowed). Consider refactoring.
Open

    def sample_index_generator(self):
        """
        Yield (moving_index, fixed_index, image_indices) sequentially, where

          - moving_index = (group1, image1)
Severity: Minor
Found in deepreg/dataset/loader/grouped_loader.py - About 2 hrs to fix


File registry.py has 256 lines of code (exceeds 250 allowed). Consider refactoring.
Open

from copy import deepcopy
from typing import Any, Callable, Dict, Optional

BACKBONE_CLASS = "backbone_class"
LOSS_CLASS = "loss_class"
Severity: Minor
Found in deepreg/registry.py - About 2 hrs to fix