iterative/dvc

Showing 547 of 589 total issues

Avoid too many return statements within this function.

    return path

Severity: Major
Found in dvc/config.py - About 30 mins to fix

Avoid too many return statements within this function.

    return path or default, name

Severity: Major
Found in dvc/utils/__init__.py - About 30 mins to fix

Avoid too many return statements within this function.

    return self.fs.relpath(self.fs_path, self.repo.root_dir)

Severity: Major
Found in dvc/output.py - About 30 mins to fix
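Issues of this kind are commonly resolved by replacing branch-per-value returns with a table lookup or a single result variable. A generic sketch of the pattern (illustrative only, not code from dvc):

```python
def describe_status(code):
    # Branch-per-value style: one return per case quickly
    # trips the too-many-returns check.
    if code == 0:
        return "ok"
    if code == 1:
        return "changed"
    if code == 2:
        return "missing"
    return "unknown"

def describe_status_flat(code):
    # Table-driven style: the mapping carries the cases,
    # leaving a single return statement.
    statuses = {0: "ok", 1: "changed", 2: "missing"}
    return statuses.get(code, "unknown")
```

Both functions behave identically; the second keeps one exit point and grows by one dict entry per new case.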

Similar blocks of code found in 2 locations. Consider refactoring.

    @contextmanager
    def modify_toml(path, fs=None):
        with _modify_data(path, parse_toml_for_update, _dump, fs=fs) as d:
            yield d

Severity: Minor
Found in dvc/utils/serialize/_toml.py and 1 other location - About 30 mins to fix
dvc/utils/serialize/_json.py on lines 30..33

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

    Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency both to continue to replicate and to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 32.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
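For the modify_toml/modify_json pair flagged above, one common DRY refactoring is a small factory that closes over the parser/dumper pair, so the near-identical bodies collapse into one definition. This sketch uses hypothetical JSON helpers and a plain dict as the backing store rather than dvc's actual `_modify_data` filesystem plumbing:

```python
import json
from contextlib import contextmanager

def _parse_json(text):
    # Hypothetical stand-in for a serializer's parse function.
    return json.loads(text or "{}")

def _dump_json(data):
    # Hypothetical stand-in for a serializer's dump function.
    return json.dumps(data)

def make_modifier(parser, dumper):
    """Build a modify_* context manager from a parser/dumper pair."""
    @contextmanager
    def modify(store):
        data = parser(store["text"])
        yield data                    # caller mutates the parsed data
        store["text"] = dumper(data)  # serialize the result back
    return modify

modify_json = make_modifier(_parse_json, _dump_json)

store = {"text": '{"a": 1}'}
with modify_json(store) as d:
    d["b"] = 2
```

After the `with` block, `store["text"]` holds the updated document; a `modify_toml` would be built the same way from a TOML parser/dumper pair.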


Avoid too many return statements within this function.

    return 0

Severity: Major
Found in dvc/commands/status.py - About 30 mins to fix

Avoid too many return statements within this function.

    return 1

Severity: Major
Found in dvc/commands/plots.py - About 30 mins to fix

Avoid too many return statements within this function.

    return 0

Severity: Major
Found in dvc/commands/plots.py - About 30 mins to fix

Similar blocks of code found in 2 locations. Consider refactoring.

    @contextmanager
    def modify_json(path, fs=None):
        with _modify_data(path, parse_json, _dump_json, fs=fs) as d:
            yield d

Severity: Minor
Found in dvc/utils/serialize/_json.py and 1 other location - About 30 mins to fix
dvc/utils/serialize/_toml.py on lines 48..51


Avoid too many return statements within this function.

    return 0

Severity: Major
Found in dvc/commands/plots.py - About 30 mins to fix

Avoid too many return statements within this function.

    return ui.open_browser(output_file)

Severity: Major
Found in dvc/commands/plots.py - About 30 mins to fix

Avoid too many return statements within this function.

    return _val

Severity: Major
Found in dvc/compare.py - About 30 mins to fix

Avoid too many return statements within this function.

    return self._set(remote_or_db, section, opt)

Severity: Major
Found in dvc/commands/config.py - About 30 mins to fix

Identical blocks of code found in 2 locations. Consider refactoring.

    for file in out.files:
        assert file["version_id"]
        assert file["remote"] == "upstream"

Severity: Minor
Found in dvc/testing/remote_tests.py and 1 other location - About 30 mins to fix
dvc/testing/remote_tests.py on lines 228..230


Identical blocks of code found in 2 locations. Consider refactoring.

    for file in out.files:
        assert file["version_id"]
        assert file["remote"] == "upstream"

Severity: Minor
Found in dvc/testing/remote_tests.py and 1 other location - About 30 mins to fix
dvc/testing/remote_tests.py on lines 321..323


Function celery_remove has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def celery_remove(self: "LocalCeleryQueue", revs: Collection[str]) -> list[str]:
        """Remove the specified entries from the queue.

        Arguments:
            revs: Stash revisions or queued exp names to be removed.

Severity: Minor
Found in dvc/repo/experiments/queue/remove.py - About 25 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

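The nesting rule in particular is what tends to push functions like these over the threshold; replacing nested conditionals with guard clauses usually brings the score back down without changing behavior. A generic sketch (not dvc code):

```python
def find_active(entries):
    # Nested version: each level of nesting adds to cognitive complexity.
    result = []
    for entry in entries:
        if entry is not None:
            if entry.get("active"):
                result.append(entry["name"])
    return result

def find_active_flat(entries):
    # Guard clause keeps the flow linear: one level of nesting,
    # same behavior.
    result = []
    for entry in entries:
        if entry is None or not entry.get("active"):
            continue
        result.append(entry["name"])
    return result
```

Both return the same names; the flat version reads top-to-bottom with no indentation ladder.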

Function prepare_default_pager has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def prepare_default_pager(
        clear_screen: bool = False,
        quit_if_one_screen: bool = True,
        ansi_escapes: bool = True,
        chop_long_lines: bool = True,

Severity: Minor
Found in dvc/ui/pager.py - About 25 mins to fix


Function _get_flags has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def _get_flags(out):
        annot = out.annot.to_dict()
        yield from annot.items()

        if not out.use_cache:

Severity: Minor
Found in dvc/stage/serialize.py - About 25 mins to fix


Function _infer_y_from_data has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def _infer_y_from_data(self):
        if self.plot_id in self.data:
            for lst in _lists(self.data[self.plot_id]):
                if all(isinstance(item, dict) for item in lst):
                    datapoint = first(lst)

Severity: Minor
Found in dvc/render/converter/vega.py - About 25 mins to fix


Function kill has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def kill(self, revs: Collection[str], force: bool = False) -> None:
        name_dict: dict[str, Optional[QueueEntry]] = self.match_queue_entry_by_name(
            set(revs), self.iter_active()
        )

Severity: Minor
Found in dvc/repo/experiments/queue/celery.py - About 25 mins to fix


Function collect_failed_data has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def collect_failed_data(
        self,
        baseline_revs: Optional[Collection[str]],
        **kwargs,
    ) -> dict[str, list["ExpRange"]]:

Severity: Minor
Found in dvc/repo/experiments/queue/celery.py - About 25 mins to fix

