datopian/metastore-lib


Showing 14 of 14 total issues

File storage.py has 270 lines of code (exceeds 250 allowed). Consider refactoring.

# coding=utf-8
"""Github Storage Backend implementation

This backend stores datasets as GitHub repositories, utilizing Git's built-in
revisions and tags. This implementation is based on GitHub's Web API, and will
Severity: Minor
Found in metastore/backend/github/storage.py - About 2 hrs to fix

File filesystem.py has 259 lines of code (exceeds 250 allowed). Consider refactoring.

"""Pyfilesystem based versioned metadata storage

This is useful especially for testing and POC implementations
"""
import hashlib
Severity: Minor
Found in metastore/backend/filesystem.py - About 2 hrs to fix

FilesystemStorage has 21 functions (exceeds 20 allowed). Consider refactoring.

class FilesystemStorage(StorageBackend):
    """Abstract filesystem based storage based on PyFilesystem

    This storage backend is useful mostly in testing, especially with the
    'mem://' file system. You most likely shouldn't be using it in production,
Severity: Minor
Found in metastore/backend/filesystem.py - About 2 hrs to fix

GitHubStorage has 21 functions (exceeds 20 allowed). Consider refactoring.

class GitHubStorage(StorageBackend):
    """GitHub based metadata storage
    """

    DEFAULT_README = ('# ¯\\_(ツ)_/¯\n'
Severity: Minor
Found in metastore/backend/github/storage.py - About 2 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.

        try:
            package_dir = self._fs.makedirs(_get_package_path(package_id))
        except DirectoryExists:
            raise exc.Conflict("Package with id {} already exists".format(package_id))
Severity: Major
Found in metastore/backend/filesystem.py and 1 other location - About 1 hr to fix
metastore/backend/filesystem.py on lines 266..269

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 39.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Similar blocks of code found in 2 locations. Consider refactoring.

        try:
            package_dir = self._fs.opendir(_get_package_path(package_id))
        except ResourceNotFound:
            raise exc.NotFound('Could not find package {}'.format(package_id))
Severity: Major
Found in metastore/backend/filesystem.py and 1 other location - About 1 hr to fix
metastore/backend/filesystem.py on lines 45..48
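One way to resolve both of these duplicated blocks is to extract a single helper that maps filesystem errors to storage exceptions in one place. The following is only a sketch: the helper name, the stub exception classes, and the free-function shape are assumptions, standing in for the real pyfilesystem (`fs.errors`) and `metastore.backend.exc` types named in the excerpts.

```python
# Sketch: fold the two duplicated try/except blocks into one helper.
# The exception classes below are stand-ins for fs.errors.DirectoryExists,
# fs.errors.ResourceNotFound, exc.Conflict and exc.NotFound.

class Conflict(Exception):
    pass


class NotFound(Exception):
    pass


class DirectoryExists(Exception):
    pass


class ResourceNotFound(Exception):
    pass


def open_package_dir(fs, path, create=False):
    """Open (or create) a package directory, translating filesystem
    errors into storage-level exceptions in a single place."""
    try:
        return fs.makedirs(path) if create else fs.opendir(path)
    except DirectoryExists:
        raise Conflict("Package with id {} already exists".format(path))
    except ResourceNotFound:
        raise NotFound("Could not find package {}".format(path))
```

Both call sites then reduce to `self._open_package_dir(...)` with or without `create=True`, which should also drop the duplication mass below the reported threshold.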

Function _create_lfs_files has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def _create_lfs_files(self, datapackage):
        # type: (Dict[str, Any]) -> List[gh.InputGitTreeElement]
        """Create LFS pointer files and config files, if we need to

        :raise ValueError: If resources with conflicting file names are found
Severity: Minor
Found in metastore/backend/github/storage.py - About 45 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
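The usual way to bring cognitive complexity down is to replace nested conditionals with guard clauses (early returns), since nesting is what the metric penalizes most. The function below is a generic illustration, not code from metastore-lib; its name and parameters are made up for the example.

```python
def pick_description(new_name, new_description, current):
    # Guard clauses keep the main path at nesting depth zero, which
    # scores lower for cognitive complexity than the equivalent
    # nested if/else pyramid.
    if new_name is None and new_description is None:
        raise ValueError("Expecting at least one of new_name or new_description")
    if new_description is None:
        return current
    return new_description
```

The same restructuring applies to each of the flagged functions: validate inputs first, return early for the simple cases, and leave the remaining straight-line logic un-nested.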

Function create has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def create(self, package_id, metadata, author=None, message=None):
        owner, repo_name = self._parse_id(package_id)
        datapackage = _create_file('datapackage.json', json.dumps(metadata, indent=2))
        files = [datapackage] + self._create_lfs_files(metadata)
Severity: Minor
Found in metastore/backend/github/storage.py - About 45 mins to fix

Function _create_tag has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def _create_tag(self, repo, name, description, revision_ref, author):
        # type: (gh.Repository, str, str, str, Optional[Author]) -> gh.GitTag.GitTag
        """Low level operations for creating a git tag
        """
        author = self._verify_author(author)
Severity: Minor
Found in metastore/backend/github/storage.py - About 25 mins to fix

Function tag_update has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def tag_update(self, package_id, tag, author=None, new_name=None, new_description=None):
        if new_name is None and new_description is None:
            raise ValueError("Expecting at least one of new_name or new_description to be specified")

        repo = self._get_repo(package_id)
Severity: Minor
Found in metastore/backend/github/storage.py - About 25 mins to fix

Function fetch has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

    def fetch(self, package_id, revision_ref=None, repo=None):
        if repo is None:
            repo = self._get_repo(package_id)
        try:
            if not revision_ref:
Severity: Minor
Found in metastore/backend/github/storage.py - About 25 mins to fix

Try, Except, Pass detected.

        except Exception:
Use of assert detected. The enclosed code will be removed when compiling to optimised byte code.

    assert isinstance(ref, str), ref

Use of assert detected. The enclosed code will be removed when compiling to optimised byte code.

            assert ref.object.type == 'commit'
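Both assert findings have the same remedy: `assert` statements vanish when Python runs with `-O`, so checks that guard runtime invariants should raise an explicit exception instead. A minimal sketch (the function name is hypothetical, not from the flagged code):

```python
def require_str_ref(ref):
    # An explicit isinstance check survives optimized bytecode,
    # unlike `assert isinstance(ref, str), ref`.
    if not isinstance(ref, str):
        raise TypeError("expected ref to be a str, got {!r}".format(type(ref)))
    return ref
```

The second finding (`assert ref.object.type == 'commit'`) would get the same treatment, raising `ValueError` or a domain-specific exception when the ref does not point at a commit.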