durandtibo/karbonn

View on GitHub

Showing 20 of 23 total issues

File module.py has 550 lines of code (exceeds 250 allowed). Consider refactoring.
Open

r"""Contain functionalities to analyze a ``torch.nn.Module``."""

from __future__ import annotations

__all__ = [
Severity: Major
Found in src/karbonn/utils/summary/module.py - About 1 day to fix

File cos_sin.py has 364 lines of code (exceeds 250 allowed). Consider refactoring.
Open

r"""Contain modules to encode scalar values with cosine and sine
representations."""

from __future__ import annotations

Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 4 hrs to fix

Function find_module_state_dict has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring.
Open

def find_module_state_dict(state_dict: dict | list | tuple | set, module_keys: set) -> dict:
    r"""Try to find automatically the part of the state dict related to a
    module.

    The user should specify the set of module's keys:
Severity: Minor
Found in src/karbonn/utils/state_dict.py - About 2 hrs to fix
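The recursive search the docstring describes might look like the following sketch. This is a hypothetical illustration of the technique (walk nested containers until a dict whose keys cover the module's keys is found), not karbonn's actual implementation; the function name `find_state_dict` is invented here.

```python
def find_state_dict(state_dict, module_keys: set) -> dict:
    """Recursively search nested containers for the dict that holds the
    module's keys (illustrative sketch, not karbonn's implementation)."""
    if isinstance(state_dict, dict):
        # A dict covering all of the module's keys is the match we want.
        if module_keys.issubset(state_dict.keys()):
            return state_dict
        candidates = state_dict.values()
    elif isinstance(state_dict, (list, tuple, set)):
        candidates = state_dict
    else:
        return {}  # leaf value: nothing to search
    for value in candidates:
        found = find_state_dict(value, module_keys)
        if found:
            return found
    return {}
```

Flattening the container dispatch into one helper like this is also one way to bring the cognitive complexity back under the threshold, since each container type adds only a single non-nested branch.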

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

Further reading
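The nesting rule above can be illustrated with a small hypothetical pair: both functions do the same thing, but the first nests its flow-breaking structures while the second uses guard clauses to keep the flow linear.

```python
def first_even_square_nested(values):
    # Nested version: loop -> if -> if; each nested break in linear
    # flow increases the cognitive complexity score more than the last.
    for v in values:
        if isinstance(v, int):
            if v % 2 == 0:
                return v * v
    return None

def first_even_square_flat(values):
    # Flattened version: a guard clause replaces the nested ifs, so the
    # breaks in linear flow are no longer compounded by nesting.
    for v in values:
        if not isinstance(v, int) or v % 2 != 0:
            continue
        return v * v
    return None
```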

Identical blocks of code found in 2 locations. Consider refactoring.
Open

__all__ = [
    "Asinh",
    "AsinhCosSinScalarEncoder",
    "AsinhMSELoss",
    "AsinhScalarEncoder",
Severity: Major
Found in src/karbonn/__init__.py and 1 other location - About 1 hr to fix
src/karbonn/modules/__init__.py on lines 5..45

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
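For the duplicated `__all__` lists flagged here, one hypothetical way to satisfy DRY is to define the export list once and build both `__all__` values from it. The sketch below simulates the two files in one script; the `_PUBLIC_API` constant is illustrative, and only the first few names from the excerpt are shown.

```python
# Canonical export list, defined once (names taken from the excerpt above).
_PUBLIC_API = (
    "Asinh",
    "AsinhCosSinScalarEncoder",
    "AsinhMSELoss",
    "AsinhScalarEncoder",
)

# karbonn/modules/__init__.py would own the canonical list ...
modules_all = list(_PUBLIC_API)

# ... and karbonn/__init__.py would re-export it instead of copying it,
# so the two lists can never silently diverge.
package_all = list(modules_all)
```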

Tuning

This issue has a mass of 46.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
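As an illustration, a `.codeclimate.yml` fragment along these lines raises the Python mass threshold above this issue's mass of 46 (the exact value and config shape should be checked against codeclimate-duplication's documentation):

```yaml
# Illustrative fragment: raise the duplication mass threshold for Python.
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 60
```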

Refactorings

Further Reading

Identical blocks of code found in 2 locations. Consider refactoring.
Open

__all__ = [
    "Asinh",
    "AsinhCosSinScalarEncoder",
    "AsinhMSELoss",
    "AsinhScalarEncoder",
Severity: Major
Found in src/karbonn/modules/__init__.py and 1 other location - About 1 hr to fix
src/karbonn/__init__.py on lines 5..45


Function general_robust_regression_loss has 7 arguments (exceeds 4 allowed). Consider refactoring.
Open

def general_robust_regression_loss(
Severity: Major
Found in src/karbonn/functional/loss/general_robust.py - About 50 mins to fix
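A common way to address "too many arguments" findings like this one is to group related hyperparameters into a configuration object. The sketch below is hypothetical: the parameter names, the `RobustLossConfig` dataclass, and the placeholder loss body are illustrative and are not karbonn's actual API.

```python
from dataclasses import dataclass

@dataclass
class RobustLossConfig:
    """Groups loss hyperparameters so the function signature stays small."""
    alpha: float = 2.0       # shape parameter (illustrative)
    scale: float = 1.0       # scale parameter (illustrative)
    reduction: str = "mean"  # "mean" or "sum"

def robust_loss(prediction, target, config=RobustLossConfig()):
    # Placeholder body: a scaled power error standing in for the real loss.
    errs = [(abs(p - t) / config.scale) ** config.alpha
            for p, t in zip(prediction, target)]
    total = sum(errs)
    return total / len(errs) if config.reduction == "mean" else total
```

The call sites then pass two tensors plus one config, instead of seven positional arguments whose order is easy to get wrong.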

Function create_rand_frequency has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_rand_frequency(
Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 35 mins to fix

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(
Severity: Minor
Found in src/karbonn/modules/loss/general_robust.py - About 35 mins to fix

Function create_logspace_frequency has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_logspace_frequency(
Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 35 mins to fix

Function create_logspace_value_range has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_logspace_value_range(
Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 35 mins to fix

Function create_rand_value_range has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_rand_value_range(
Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 35 mins to fix

Function create_linspace_value_range has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_linspace_value_range(
Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 35 mins to fix

Function create_linspace_frequency has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_linspace_frequency(
Severity: Minor
Found in src/karbonn/modules/scalar/cos_sin.py - About 35 mins to fix

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(
Severity: Minor
Found in src/karbonn/modules/exu.py - About 35 mins to fix

Function create_linspace_scale has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_linspace_scale(
Severity: Minor
Found in src/karbonn/modules/scalar/asinh.py - About 35 mins to fix

Function create_logspace_scale has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def create_logspace_scale(
Severity: Minor
Found in src/karbonn/modules/scalar/asinh.py - About 35 mins to fix

Function get_named_modules has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.
Open

def get_named_modules(
    module: nn.Module, depth: int = 0
) -> Generator[tuple[str, nn.Module], None, None]:
    r"""Return an iterator over the modules, yielding both the name of
    the module as well as the module itself.
Severity: Minor
Found in src/karbonn/utils/iterator.py - About 35 mins to fix
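A depth-limited named-module iterator of this kind can be sketched without torch by using a toy tree in place of `nn.Module`. Everything below is hypothetical (the `Toy` class, `get_named`, and the `max_depth` parameter are invented for illustration and do not mirror karbonn's signature):

```python
class Toy:
    """Stand-in for a module with named children."""
    def __init__(self, **children):
        self.children = children

def get_named(module, prefix="", depth=0, max_depth=1):
    # Yield the current module, then recurse into children until the
    # depth limit is reached. Keeping the recursion in one flat loop
    # (no nested conditionals) keeps cognitive complexity low.
    yield prefix or "root", module
    if depth >= max_depth:
        return
    for name, child in module.children.items():
        full = f"{prefix}.{name}" if prefix else name
        yield from get_named(child, full, depth + 1, max_depth)
```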


Function find_in_features has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def find_in_features(self, module: nn.ModuleList) -> list[int]:
        sizes = set()
        for child in module:
            with contextlib.suppress(SizeNotFoundError):
                sizes.add(tuple(find_in_features(child)))
Severity: Minor
Found in src/karbonn/utils/size/list.py - About 25 mins to fix


Function find_out_features has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

    def find_out_features(self, module: nn.ModuleList) -> list[int]:
        sizes = set()
        for child in module:
            with contextlib.suppress(SizeNotFoundError):
                sizes.add(tuple(find_out_features(child)))
Severity: Minor
Found in src/karbonn/utils/size/list.py - About 25 mins to fix


Function merge_size_dtype has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

def merge_size_dtype(
    sizes: list[torch.Size | Sequence[torch.Size] | Mapping[str, torch.Size]],
    dtypes: list[torch.dtype | Sequence[torch.dtype] | Mapping[str, torch.dtype]],
) -> list[str]:
    r"""Return joined string representations of the sizes and data types.
Severity: Minor
Found in src/karbonn/utils/summary/module.py - About 25 mins to fix

