borgbackup/borg

View on GitHub

Showing 611 of 611 total issues

Function get_args has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

    def get_args(self, argv, cmd):
        """usually, just returns argv, except if we deal with a ssh forced command for borg serve."""
        result = self.parse_args(argv[1:])
        if cmd is not None and result.func == self.do_serve:
            # borg serve case:
Severity: Minor
Found in src/borg/archiver/__init__.py - About 1 hr to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

Further reading
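
As a rough illustration of how these rules add up (a sketch, not borg code): the two functions below do the same work, but the first nests flow-breaking structures four levels deep while the second flattens the control flow with a guard clause, so it scores noticeably lower.

    def total_of_positives_nested(rows):
        # Each `for`/`if` is a break in the linear flow, and the inner ones are
        # additionally penalized because they are nested inside other flow breaks.
        total = 0
        for row in rows:
            if row is not None:
                for value in row:
                    if value > 0:
                        total += value
        return total


    def total_of_positives_flat(rows):
        # Same behavior with a guard clause and a comprehension: fewer nesting
        # levels means a lower score, and the comprehension is the kind of
        # collapsing shorthand that the first rule above does not penalize.
        total = 0
        for row in rows:
            if row is None:
                continue
            total += sum(value for value in row if value > 0)
        return total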

Function sig_info_handler has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

def sig_info_handler(sig_no, stack):  # pragma: no cover
    """search the stack for infos about the currently processed file and print them"""
    with signal_handler(sig_no, signal.SIG_IGN):
        for frame in inspect.getouterframes(stack):
            func, loc = frame[3], frame[0].f_locals
Severity: Minor
Found in src/borg/archiver/__init__.py - About 1 hr to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        repo_list_epilog = (
            process_epilog(
                """
        This command lists the archives contained in a repository.

Severity: Major
Found in src/borg/archiver/repo_list_cmd.py and 1 other location - About 1 hr to fix
src/borg/archiver/list_cmd.py on lines 48..89

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 41.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Refactorings

Further Reading
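
Applied to the two epilog blocks flagged above, one way to restore a single authoritative representation is to move the shared template text into a helper that both repo_list_cmd.py and list_cmd.py call. The sketch below is illustrative only; `shared_list_epilog` and its parameters are hypothetical, not borg API:

    def shared_list_epilog(subject, extra_help=""):
        """Return the --help epilog text common to `borg list` and `borg repo-list`.

        Only `subject` ("the contents of an archive" vs. "the archives contained
        in a repository") and the command-specific `extra_help` differ; the
        shared wording lives here exactly once.
        """
        return f"""
        This command lists {subject}.

        {extra_help}
        """

Each command module would then pass the result to its existing process_epilog() call, as in the excerpts above, and keep only the wording that is genuinely command-specific, so a fix to the shared text is made exactly once.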

Function _read_files_cache has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

    def _read_files_cache(self):
        """read files cache from cache directory"""
        if "d" in self.cache_mode:  # d(isabled)
            return

Severity: Minor
Found in src/borg/cache.py - About 1 hr to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        list_epilog = (
            process_epilog(
                """
        This command lists the contents of an archive.

Severity: Major
Found in src/borg/archiver/list_cmd.py and 1 other location - About 1 hr to fix
src/borg/archiver/repo_list_cmd.py on lines 45..86

Function release has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

    def release(self):
        if not self.is_locked():
            raise NotLocked(self.path)
        if not self.by_me():
            raise NotMyLock(self.path)
Severity: Minor
Found in src/borg/fslocking.py - About 1 hr to fix

Function _process_archive has a Cognitive Complexity of 11 (exceeds 5 allowed). Consider refactoring.
Open

    def _process_archive(self, archive_id, prefix=[]):
        """Build FUSE inode hierarchy from archive metadata"""
        self.file_versions = {}  # for versions mode: original path -> version
        t0 = time.perf_counter()
        archive = Archive(self._manifest, archive_id)
Severity: Minor
Found in src/borg/fuse.py - About 1 hr to fix

Function __init__ has 30 lines of code (exceeds 25 allowed). Consider refactoring.
Open

    def __init__(
        self,
        manifest,
        name,
        *,
Severity: Minor
Found in src/borg/archive.py - About 1 hr to fix

Function check has 30 lines of code (exceeds 25 allowed). Consider refactoring.
Open

    def check(self, repair=False, max_duration=0):
        """Check repository consistency

        This method verifies all segment checksums and makes sure
        the index is consistent with the data stored in the segments.
Severity: Minor
Found in src/borg/legacyrepository.py - About 1 hr to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        with ExclusiveLock(lockpath, id=cant_know_if_dead_id):
            with pytest.raises(LockTimeout):
                ExclusiveLock(lockpath, id=our_id, timeout=0.1).acquire()
Severity: Major
Found in src/borg/testsuite/fslocking_test.py and 1 other location - About 1 hr to fix
src/borg/testsuite/fslocking_test.py on lines 73..76

Similar blocks of code found in 2 locations. Consider refactoring.
Open

            elif modebits == stat.S_IFCHR:
                tarinfo.type = tarfile.CHRTYPE
                tarinfo.devmajor = os.major(item.rdev)
                tarinfo.devminor = os.minor(item.rdev)
Severity: Major
Found in src/borg/archiver/tar_cmds.py and 1 other location - About 1 hr to fix
src/borg/archiver/tar_cmds.py on lines 172..175

Similar blocks of code found in 2 locations. Consider refactoring.
Open

            elif modebits == stat.S_IFBLK:
                tarinfo.type = tarfile.BLKTYPE
                tarinfo.devmajor = os.major(item.rdev)
                tarinfo.devminor = os.minor(item.rdev)
Severity: Major
Found in src/borg/archiver/tar_cmds.py and 1 other location - About 1 hr to fix
src/borg/archiver/tar_cmds.py on lines 176..179
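
The S_IFCHR and S_IFBLK branches flagged in these two issues differ only in the tar member type constant, so one way to honor DRY is a small lookup-driven helper. The sketch below is illustrative only; `set_device_fields` and its signature are hypothetical, not the actual tar_cmds.py code:

    import os
    import stat
    import tarfile

    # Mapping from the stat mode bits checked above to the tar member type.
    _DEVICE_TAR_TYPES = {
        stat.S_IFCHR: tarfile.CHRTYPE,
        stat.S_IFBLK: tarfile.BLKTYPE,
    }


    def set_device_fields(tarinfo, modebits, rdev):
        """Fill type/devmajor/devminor for char and block devices; return True if handled."""
        tar_type = _DEVICE_TAR_TYPES.get(modebits)
        if tar_type is None:
            return False
        tarinfo.type = tar_type
        tarinfo.devmajor = os.major(rdev)
        tarinfo.devminor = os.minor(rdev)
        return True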

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def test_timeout(self, lockpath):
        with ExclusiveLock(lockpath, id=ID1):
            with pytest.raises(LockTimeout):
                ExclusiveLock(lockpath, id=ID2, timeout=0.1).acquire()
Severity: Major
Found in src/borg/testsuite/fslocking_test.py and 1 other location - About 1 hr to fix
src/borg/testsuite/fslocking_test.py on lines 90..92
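
The two flagged test bodies differ only in which lock ids are involved, so pytest parametrization could collapse them into one test. The sketch below is hypothetical: the test name and the placeholder id tuples are invented for illustration, and `lockpath` is assumed to be the fixture the existing tests already use.

    import pytest

    from borg.fslocking import ExclusiveLock, LockTimeout

    # Placeholder ids; the real tests use ID1/ID2 and our_id/cant_know_if_dead_id
    # defined in src/borg/testsuite/fslocking_test.py.
    HOLDER_ID = ("holder-host", 1234, 0)
    WAITER_ID = ("waiter-host", 5678, 0)


    @pytest.mark.parametrize("holder_id, waiter_id", [(HOLDER_ID, WAITER_ID)])
    def test_acquire_times_out(lockpath, holder_id, waiter_id):
        # While holder_id keeps the lock, a second acquire with a short timeout
        # must raise LockTimeout instead of blocking forever.
        with ExclusiveLock(lockpath, id=holder_id):
            with pytest.raises(LockTimeout):
                ExclusiveLock(lockpath, id=waiter_id, timeout=0.1).acquire()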

Function open has 9 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def open(
Severity: Major
Found in src/borg/remote.py - About 1 hr to fix

Function __init__ has 9 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(
Severity: Major
Found in src/borg/remote.py - About 1 hr to fix

Function process_file has 9 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def process_file(self, *, path, parent_fd, name, st, cache, flags=flags_normal, last_try=False, strip_prefix):
Severity: Major
Found in src/borg/archive.py - About 1 hr to fix

Function __init__ has 9 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def __init__(
Severity: Major
Found in src/borg/legacyremote.py - About 1 hr to fix

Function open has 9 arguments (exceeds 4 allowed). Consider refactoring.
Open

    def open(
Severity: Major
Found in src/borg/legacyremote.py - About 1 hr to fix

Function setup_logging has 28 lines of code (exceeds 25 allowed). Consider refactoring.
Open

def setup_logging(
    stream=None, conf_fname=None, env_var="BORG_LOGGING_CONF", level="info", is_serve=False, log_json=False, func=None
):
    """setup logging module according to the arguments provided


Severity: Minor
Found in src/borg/logger.py - About 1 hr to fix

Function test_date_matching has 28 lines of code (exceeds 25 allowed). Consider refactoring.
Open

def test_date_matching(archivers, request):
    archiver = request.getfixturevalue(archivers)
    check_cmd_setup(archiver)

    shutil.rmtree(archiver.repository_path)
Severity: Minor
Found in src/borg/testsuite/archiver/check_cmd_test.py - About 1 hr to fix