RackHD/on-taskgraph

data/templates/secure_erase.py

Summary

Maintainability: D (estimated 2 days of remediation)
Test Coverage: not reported

File secure_erase.py has 599 lines of code (exceeds 250 allowed). Consider refactoring.

#!/usr/bin/env python

# Copyright 2016-2018, Dell EMC, Inc.

# -*- coding: UTF-8 -*-
Severity: Major
Found in data/templates/secure_erase.py - About 1 day to fix

    Function __run has a Cognitive Complexity of 20 (exceeds 5 allowed). Consider refactoring.

        def __run(self):
            """
            Get secure erase progress for secure erase task.
            """
            parser_mapper = {
    Severity: Minor
    Found in data/templates/secure_erase.py - About 2 hrs to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"

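    As an illustration of these rules (a sketch, not code from secure_erase.py), the nested version below scores higher than the flat one even though both do the same work:

        def nested_total(items):
            total = 0
            for item in items:            # +1: flow-breaking structure
                if item.enabled:          # +2: nested one level
                    if item.value > 0:    # +3: nested two levels
                        total += item.value
            return total

        def flat_total(items):
            # Collapsing the branches into one comprehension is shorthand the
            # language provides, so it is not penalized (rule one above).
            return sum(i.value for i in items if i.enabled and i.value > 0)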

    Function create_jbod has a Cognitive Complexity of 18 (exceeds 5 allowed). Consider refactoring.

    def create_jbod(disk_arg, raid_tool):
        """
        Create JBOD for each physical disk under a virtual disk.
        :param disk_arg: a dictionary contains disk argument
        :param raid_tool: tools used for JBOD creation, storcli and perccli are supported
    Severity: Minor
    Found in data/templates/secure_erase.py - About 2 hrs to fix
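
    The usual remediation is to flatten nesting with guard clauses and extracted helpers. Schematically (not the actual create_jbod body; the dictionary key and helper below are hypothetical):

        def create_jbod(disk_arg, raid_tool):
            # Guard clause: exit early instead of nesting the happy path.
            if raid_tool not in ("storcli", "perccli"):
                raise ValueError("unsupported RAID tool: %s" % raid_tool)
            for disk in disk_arg["disks"]:            # hypothetical key
                _create_single_jbod(disk, raid_tool)  # hypothetical helper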


    Function convert_raid_to_jbod has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

    def convert_raid_to_jbod():
        """
        To delete RAID and create JBOD for each physical disk of a virtual disk with RAID
        :rtype : list
        :return: a string includes all the disks to be erased
    Severity: Minor
    Found in data/templates/secure_erase.py - About 55 mins to fix


    Function mark_on_disk has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def mark_on_disk(disk_name, log, flag, back_skip, mark_files):
    Severity: Minor
    Found in data/templates/secure_erase.py - About 35 mins to fix
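
    One conventional remedy is to bundle the related mark parameters into a single object. The MarkSpec name below is hypothetical, not taken from secure_erase.py:

        import collections

        # Hypothetical grouping of the three mark-related arguments.
        MarkSpec = collections.namedtuple("MarkSpec",
                                          ["flag", "back_skip", "mark_files"])

        def mark_on_disk(disk_name, log, spec):
            """Write or verify the 512-byte disk marks described by `spec`."""
            pass  # body unchanged; read spec.flag, spec.back_skip, spec.mark_files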

      Function __sg_requests_parser has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

          def __sg_requests_parser(self, drive):
              """
              Secure erase job progress parser for sg_format and sg_sanitize tools.
              :param drive: drive name
              :return: a float digital of percentage
      Severity: Minor
      Found in data/templates/secure_erase.py - About 35 mins to fix


      Function __get_hdparm_duration has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

          def __get_hdparm_duration(self, log):
              """
              Get hdparm required secure erase time.
              :param log: a file object of secure erase log
              :return: required secure erase time indicated by hdparm tool
      Severity: Minor
      Found in data/templates/secure_erase.py - About 35 mins to fix


      Function get_disk_size has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

      def get_disk_size(disk_name, log, mark_files):
          """
          Get disk size and create empty mark files
          :param disk_name: disk name that be copied data to.
          :param log: an opened file object to store stdout and stderr
      Severity: Minor
      Found in data/templates/secure_erase.py - About 25 mins to fix


      Function mark_on_disk has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

      def mark_on_disk(disk_name, log, flag, back_skip, mark_files):
          """
          Copy 512 Bytes random data to specified disk address as a mark.
          Or to read the marks from disk for verification
          :param disk_name: disk name that be copied data to.
      Severity: Minor
      Found in data/templates/secure_erase.py - About 25 mins to fix


      Identical blocks of code found in 2 locations. Consider refactoring.

              for command in commands:
                  exit_status = robust_check_call(command, log)
                  assert exit_status["exit_code"] == 0, "Command [ %s ] failed" % " ".join(command)
      Severity: Major
      Found in data/templates/secure_erase.py and 1 other location - About 1 hr to fix
      data/templates/secure_erase.py on lines 487..489

      Duplicated Code

      Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

      Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

      When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
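
      Here the repeated loop is small enough to extract directly. A minimal sketch (the run_commands name is hypothetical):

          def run_commands(commands, log):
              """Run each command via robust_check_call, asserting success."""
              for command in commands:
                  exit_status = robust_check_call(command, log)
                  assert exit_status["exit_code"] == 0, \
                      "Command [ %s ] failed" % " ".join(command)

      Both call sites then reduce to run_commands(commands, log) and run_commands(commands[3:5], log).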

      Tuning

      This issue has a mass of 40.

      We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

      The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

      If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

      See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
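
      For example, a stanza along these lines raises the Python mass threshold (a sketch; confirm the exact keys against the codeclimate-duplication documentation):

          # .codeclimate.yml
          plugins:
            duplication:
              enabled: true
              config:
                languages:
                  python:
                    mass_threshold: 50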


      Identical blocks of code found in 2 locations. Consider refactoring.

              for command in commands[3:5]:
                  exit_status = robust_check_call(command, log)
                  assert exit_status["exit_code"] == 0, "Command [ %s ] failed" % " ".join(command)
      Severity: Major
      Found in data/templates/secure_erase.py and 1 other location - About 1 hr to fix
      data/templates/secure_erase.py on lines 478..480


      Identical blocks of code found in 2 locations. Consider refactoring.

          try:
              output = subprocess.check_output(cmd, shell=False, stderr=log)
          except subprocess.CalledProcessError as exc:
              exit_status["message"] = exc.output
              exit_status["exit_code"] = exc.returncode
      Severity: Minor
      Found in data/templates/secure_erase.py and 1 other location - About 40 mins to fix
      data/templates/secure_erase.py on lines 393..397
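
      This block and its twin below differ only in which subprocess function they call; both could delegate to one wrapper (a sketch; the _run_and_capture name is hypothetical, and COMMAND_LOG_MARKER is the module constant seen in the excerpts further down):

          import subprocess

          def _run_and_capture(runner, cmd, log, **kwargs):
              """Run `runner` (subprocess.check_output or subprocess.check_call)
              and fold any CalledProcessError into a status dict. On success,
              "message" holds the runner's return value: captured output for
              check_output, the zero return code for check_call."""
              exit_status = {"exit_code": 0, "message": None}
              log.write(COMMAND_LOG_MARKER + "[" + " ".join(cmd) + "] output:\n")
              try:
                  exit_status["message"] = runner(cmd, shell=False,
                                                  stderr=log, **kwargs)
              except subprocess.CalledProcessError as exc:
                  exit_status["message"] = exc.output
                  exit_status["exit_code"] = exc.returncode
              return exit_status

      Usage would look like _run_and_capture(subprocess.check_output, cmd, log) and _run_and_capture(subprocess.check_call, cmd, log, stdout=log). The shared log-marker line in the wrapper would also absorb the two identical log.write calls reported in the last two issues below.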


      Identical blocks of code found in 2 locations. Consider refactoring.

          try:
              exit_code = subprocess.check_call(cmd, shell=False, stdout=log, stderr=log)
          except subprocess.CalledProcessError as exc:
              exit_status["message"] = exc.output
              exit_status["exit_code"] = exc.returncode
      Severity: Minor
      Found in data/templates/secure_erase.py and 1 other location - About 40 mins to fix
      data/templates/secure_erase.py on lines 415..419


      Identical blocks of code found in 2 locations. Consider refactoring.

          log.write(COMMAND_LOG_MARKER + "[" + " ".join(cmd) + "] output:\n")
      Severity: Minor
      Found in data/templates/secure_erase.py and 1 other location - About 35 mins to fix
      data/templates/secure_erase.py on lines 391..391


      Identical blocks of code found in 2 locations. Consider refactoring.

          log.write(COMMAND_LOG_MARKER + "[" + " ".join(cmd) + "] output:\n")
      Severity: Minor
      Found in data/templates/secure_erase.py and 1 other location - About 35 mins to fix
      data/templates/secure_erase.py on lines 413..413

