crowbar/crowbar-hadoop

Showing 509 of 509 total issues

Assignment Branch Condition size for create_proposal is too high. [117.3/30] (https://github.com/SUSE/style-guides/blob/master/Ruby.md#metricsabcsize, http://c2.com/cgi/wiki?AbcMetric)
Open

  def create_proposal
    @logger.debug("hadoop_infrastructure create_proposal: entering")
    base = super

    adminnodes = [] # Crowbar admin node (size=1).

This cop checks that the ABC size of methods is not higher than the configured maximum. The ABC size is based on assignments, branches (method calls), and conditions. See http://c2.com/cgi/wiki?AbcMetric
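RuboCop reports the ABC score as the square root of the summed squares of the three counts, so a figure like 117.3 implies counts far above what the limit of 30 allows. A minimal sketch of the formula (the example counts are illustrative, not measured from create_proposal):

```ruby
# RuboCop's Metrics/AbcSize score: sqrt(assignments^2 + branches^2 + conditions^2).
def abc_size(assignments, branches, conditions)
  Math.sqrt(assignments**2 + branches**2 + conditions**2).round(2)
end

# Illustrative counts only -- not measured from create_proposal.
puts abc_size(4, 18, 6)    # a small helper-sized method, well under 30
puts abc_size(40, 105, 30) # roughly the scale needed to score above 100
```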

Assignment Branch Condition size for create_proposal is too high. [95.97/30] (https://github.com/SUSE/style-guides/blob/master/Ruby.md#metricsabcsize, http://c2.com/cgi/wiki?AbcMetric)
Open

  def create_proposal
    @logger.debug("hadoop create_proposal: entering")
    base = super

    # Compute the hadoop cluster node distribution.

Assignment Branch Condition size for nodes is too high. [79.06/30] (https://github.com/SUSE/style-guides/blob/master/Ruby.md#metricsabcsize, http://c2.com/cgi/wiki?AbcMetric)
Open

  def nodes
    @hadoop_config = @service_object.get_hadoop_config

    respond_to do |format|
      format.html { render template: "barclamp/hadoop_infrastructure/nodes" }

Method create_proposal has a Cognitive Complexity of 41 (exceeds 5 allowed). Consider refactoring.
Open

  def create_proposal
    @logger.debug("hadoop_infrastructure create_proposal: entering")
    base = super

    adminnodes = [] # Crowbar admin node (size=1).
Severity: Minor
Found in crowbar_framework/app/models/hadoop_infrastructure_service.rb - About 6 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

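The second and third rules are what usually drive scores like 41: each nested branch or loop costs more than the same branch at the top level. A hedged sketch of the flattening refactor (the node-filtering logic is invented for illustration, not taken from create_proposal):

```ruby
# Nested branching: each level of nesting raises cognitive complexity.
def eligible_nodes_nested(nodes)
  result = []
  nodes.each do |node|
    if node[:ready]
      if node[:role] != "admin"
        result << node[:name]
      end
    end
  end
  result
end

# Flattened with `next` guards: same behavior, linear flow, lower score.
def eligible_nodes_flat(nodes)
  nodes.each_with_object([]) do |node, result|
    next unless node[:ready]
    next if node[:role] == "admin"
    result << node[:name]
  end
end
```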

Method create_proposal has a Cognitive Complexity of 39 (exceeds 5 allowed). Consider refactoring.
Open

  def create_proposal
    @logger.debug("hadoop create_proposal: entering")
    base = super

    # Compute the hadoop cluster node distribution.
Severity: Minor
Found in crowbar_framework/app/models/hadoop_service.rb - About 5 hrs to fix

Cyclomatic complexity for create_proposal is too high. [26/6]
Open

  def create_proposal
    @logger.debug("hadoop create_proposal: entering")
    base = super

    # Compute the hadoop cluster node distribution.

This cop checks that the cyclomatic complexity of methods is not higher than the configured maximum. The cyclomatic complexity is the number of linearly independent paths through a method. The algorithm counts decision points and adds one.

An if statement (or unless or ?:) increases the complexity by one. An else branch does not, since it doesn't add a decision point. The && operator (or keyword and) can be converted to a nested if statement, and ||/or is shorthand for a sequence of ifs, so they also add one. Loops can be said to have an exit condition, so they add one.
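The counting rules above can be sketched on a hypothetical method (not taken from the repository):

```ruby
# Hypothetical method annotated with RuboCop's cyclomatic counting rules.
def classify(node)                      # +1 (a method has one path by default)
  if node[:admin]                       # +1 (if)
    "admin"
  elsif node[:master] && node[:ready]   # +1 (elsif) and +1 (&&)
    "master"
  else                                  # +0 (else adds no decision point)
    "slave"
  end
end                                     # cyclomatic complexity: 4
```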

Cyclomatic complexity for create_proposal is too high. [26/6]
Open

  def create_proposal
    @logger.debug("hadoop_infrastructure create_proposal: entering")
    base = super

    adminnodes = [] # Crowbar admin node (size=1).

Perceived complexity for create_proposal is too high. [27/7]
Open

  def create_proposal
    @logger.debug("hadoop create_proposal: entering")
    base = super

    # Compute the hadoop cluster node distribution.

This cop produces a complexity score that reflects what a reader experiences when looking at a method. For that reason it counts when nodes as adding less complexity than an if or a &&. The exception is a case construct with no expression after case: the cop treats it as an if/elsif/elsif... chain and counts every when node in full. In contrast to the CyclomaticComplexity cop, this cop also counts else nodes as adding complexity.

Example:

def my_method                   # 1
  if cond                       # 1
    case var                    # 2 (0.8 + 4 * 0.2, rounded)
    when 1 then func_one
    when 2 then func_two
    when 3 then func_three
    when 4..10 then func_other
    end
  else                          # 1
    do_something until a && b   # 2
  end                           # ===
end                             # 7 complexity points

Perceived complexity for create_proposal is too high. [27/7]
Open

  def create_proposal
    @logger.debug("hadoop_infrastructure create_proposal: entering")
    base = super

    adminnodes = [] # Crowbar admin node (size=1).

Method has too many lines. [65/50] (https://github.com/bbatsov/ruby-style-guide#short-methods)
Open

  def create_proposal
    @logger.debug("hadoop_infrastructure create_proposal: entering")
    base = super

    adminnodes = [] # Crowbar admin node (size=1).

This cop checks if the length of a method exceeds some maximum value. Comment lines can optionally be ignored. The maximum allowed length is configurable.
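A common remedy for both this cop and the complexity cops above is to extract each concern into a named private helper, so no single method approaches the limit. A sketch with invented names, not the actual decomposition of hadoop_infrastructure_service.rb:

```ruby
# Sketch only: helper names are invented for illustration.
class ProposalBuilder
  def initialize(nodes)
    @nodes = nodes
  end

  # A long create_proposal can delegate each concern to a named helper,
  # keeping every method well under the 25/50-line limits.
  def create_proposal
    { admin: admin_nodes, workers: worker_nodes }
  end

  private

  def admin_nodes
    @nodes.select { |n| n[:role] == "admin" }.map { |n| n[:name] }
  end

  def worker_nodes
    @nodes.reject { |n| n[:role] == "admin" }.map { |n| n[:name] }
  end
end
```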

Similar blocks of code found in 3 locations. Consider refactoring.
Open

service "hadoop-0.20-datanode" do
  supports start: true, stop: true, status: true, restart: true
  # Subscribe to common configuration change events (default.rb).
  subscribes :restart, resources(directory: node[:hadoop][:env][:hadoop_log_dir])
  subscribes :restart, resources(directory: node[:hadoop][:core][:hadoop_tmp_dir])
Severity: Major
Found in chef/cookbooks/hadoop/recipes/slavenode.rb and 2 other locations - About 2 hrs to fix
chef/cookbooks/hadoop/recipes/secondarynamenode.rb on lines 44..57
chef/cookbooks/hadoop/recipes/slavenode.rb on lines 62..75

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 98.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
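When duplicated blocks differ only in the service name, a loop (or a shared Chef definition) removes the repetition. A sketch for the two copies in slavenode.rb, assuming they are identical apart from the service name; verify against lines 45..58 and 62..75 before applying, and note the secondarynamenode copy lives in another recipe, so it would need a shared recipe or definition:

```ruby
# slavenode.rb -- sketch only; assumes the two service blocks differ
# only in the service name.
%w(datanode tasktracker).each do |svc|
  service "hadoop-0.20-#{svc}" do
    supports start: true, stop: true, status: true, restart: true
    # Subscribe to common configuration change events (default.rb).
    subscribes :restart, resources(directory: node[:hadoop][:env][:hadoop_log_dir])
    subscribes :restart, resources(directory: node[:hadoop][:core][:hadoop_tmp_dir])
  end
end
```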

Similar blocks of code found in 3 locations. Consider refactoring.
Open

service "hadoop-0.20-tasktracker" do
  supports start: true, stop: true, status: true, restart: true
  # Subscribe to common configuration change events (default.rb).
  subscribes :restart, resources(directory: node[:hadoop][:env][:hadoop_log_dir])
  subscribes :restart, resources(directory: node[:hadoop][:core][:hadoop_tmp_dir])
Severity: Major
Found in chef/cookbooks/hadoop/recipes/slavenode.rb and 2 other locations - About 2 hrs to fix
chef/cookbooks/hadoop/recipes/secondarynamenode.rb on lines 44..57
chef/cookbooks/hadoop/recipes/slavenode.rb on lines 45..58

This issue has a mass of 98.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

service "hadoop-0.20-secondarynamenode" do
  supports start: true, stop: true, status: true, restart: true
  # Subscribe to common configuration change events (default.rb).
  subscribes :restart, resources(directory: node[:hadoop][:env][:hadoop_log_dir])
  subscribes :restart, resources(directory: node[:hadoop][:core][:hadoop_tmp_dir])
Severity: Major
Found in chef/cookbooks/hadoop/recipes/secondarynamenode.rb and 2 other locations - About 2 hrs to fix
chef/cookbooks/hadoop/recipes/slavenode.rb on lines 45..58
chef/cookbooks/hadoop/recipes/slavenode.rb on lines 62..75

This issue has a mass of 98.

Method create_proposal has 65 lines of code (exceeds 25 allowed). Consider refactoring.
Open

  def create_proposal
    @logger.debug("hadoop_infrastructure create_proposal: entering")
    base = super

    adminnodes = [] # Crowbar admin node (size=1).
Severity: Major
Found in crowbar_framework/app/models/hadoop_infrastructure_service.rb - About 2 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

service "hadoop-0.20-jobtracker" do
  supports start: true, stop: true, status: true, restart: true
  # Subscribe to common configuration change events (default.rb).
  subscribes :restart, resources(directory: node[:hadoop][:env][:hadoop_log_dir])
  subscribes :restart, resources(directory: node[:hadoop][:core][:hadoop_tmp_dir])
Severity: Major
Found in chef/cookbooks/hadoop/recipes/masternamenode.rb and 1 other location - About 2 hrs to fix
chef/cookbooks/hadoop/recipes/masternamenode.rb on lines 45..57

This issue has a mass of 92.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

service "hadoop-0.20-namenode" do
  supports start: true, stop: true, status: true, restart: true
  # Subscribe to common configuration change events (default.rb).
  subscribes :restart, resources(directory: node[:hadoop][:env][:hadoop_log_dir])
  subscribes :restart, resources(directory: node[:hadoop][:core][:hadoop_tmp_dir])
Severity: Major
Found in chef/cookbooks/hadoop/recipes/masternamenode.rb and 1 other location - About 2 hrs to fix
chef/cookbooks/hadoop/recipes/masternamenode.rb on lines 61..73

This issue has a mass of 92.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  def create_proposal
    @logger.debug("pig create_proposal: entering")
    base = super

    # Get the node list.
Severity: Major
Found in crowbar_framework/app/models/pig_service.rb and 1 other location - About 2 hrs to fix
crowbar_framework/app/models/sqoop_service.rb on lines 28..55

This issue has a mass of 91.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  def create_proposal
    @logger.debug("sqoop create_proposal: entering")
    base = super

    # Get the node list.
Severity: Major
Found in crowbar_framework/app/models/sqoop_service.rb and 1 other location - About 2 hrs to fix
crowbar_framework/app/models/pig_service.rb on lines 28..55

This issue has a mass of 91.
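For the pig/sqoop pair above, the usual fix is to hoist the shared create_proposal body into a mixin both services include. A sketch with illustrative names (the real services subclass a Crowbar service object; this standalone version only demonstrates the structure):

```ruby
# Sketch: shared proposal logic extracted into a mixin. Names are
# illustrative; the real pig_service.rb and sqoop_service.rb would keep
# their own create_proposal and delegate the duplicated section here.
module SingleRoleProposal
  # Shared "get the node list" step: all non-admin node names.
  def candidate_nodes(all_nodes)
    all_nodes.reject { |n| n[:admin] }.map { |n| n[:name] }
  end
end

class PigService
  include SingleRoleProposal
end

class SqoopService
  include SingleRoleProposal
end
```

Each service then calls the shared helper from its own create_proposal, so the two copies can no longer diverge.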

Cyclomatic complexity for create_proposal is too high. [7/6]
Open

  def create_proposal
    @logger.debug("hive create_proposal: entering")
    base = super

    # Get the node list.

Cyclomatic complexity for create_proposal is too high. [7/6]
Open

  def create_proposal
    @logger.debug("pig create_proposal: entering")
    base = super

    # Get the node list.

