Showing 1,218 of 1,218 total issues

File flavors.rb has 4629 lines of code (exceeds 250 allowed). Consider refactoring.
Open

require 'fog/aws/models/compute/flavor'

# To compute RAM from AWS doc https://aws.amazon.com/fr/ec2/instance-types
# we can use this formula: RAM (in MB) = AWS_RAM (in GiB) * 1073.742 MB/GiB
module Fog
Severity: Major
Found in lib/fog/aws/models/compute/flavors.rb - About 1 wk to fix
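For reference, the conversion in that comment is plain unit arithmetic: 1 GiB is 1024³ bytes ≈ 1073.742 MB, so a flavor's RAM in MB is its GiB figure times 1073.742. A minimal standalone Ruby sketch (not taken from flavors.rb):

# Convert an instance size quoted in GiB (as in the AWS docs) to MB.
MB_PER_GIB = (1024.0**3) / 1_000_000  # => 1073.741824

def gib_to_mb(gib)
  (gib * MB_PER_GIB).round
end

gib_to_mb(8)  # => 8590, i.e. an 8 GiB instance type is listed as roughly 8590 MB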

Method put_bucket_lifecycle has a Cognitive Complexity of 129 (exceeds 5 allowed). Consider refactoring.
Open

        def put_bucket_lifecycle(bucket_name, lifecycle)
          builder = Nokogiri::XML::Builder.new do
            LifecycleConfiguration {
              lifecycle['Rules'].each do |rule|
                Rule {
Severity: Minor
Found in lib/fog/aws/requests/storage/put_bucket_lifecycle.rb - About 2 days to fix
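For context, the lifecycle argument is a Ruby hash mirroring S3's lifecycle XML; only the 'Rules' key is visible in the excerpt above, so the inner keys in this sketch are an assumption based on S3's lifecycle document format, not taken from this code, and storage stands for an S3 connection:

# Hypothetical call shape — inner keys assumed from S3's lifecycle XML.
lifecycle = {
  'Rules' => [
    {
      'ID'         => 'expire-logs',
      'Prefix'     => 'logs/',
      'Enabled'    => true,
      'Expiration' => { 'Days' => 30 }
    }
  ]
}
storage.put_bucket_lifecycle('my-bucket', lifecycle)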

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules (illustrated by the sketch below):

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
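As a rough illustration of how those increments accrue, here is a hand-annotated Ruby sketch; the annotations follow the rules above, not an exact score the engine would report:

def overdue_total(invoices)
  total = nil
  invoices.each do |invoice|    # +1: a loop is a break in the linear flow
    if invoice.overdue?         # +1 for the branch, plus +1 because it is nested inside the loop
      total ||= 0               # shorthand like ||= adds no complexity of its own
      total += invoice.amount
    end
  end
  total
end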

File storage.rb has 676 lines of code (exceeds 250 allowed). Consider refactoring.
Open

module Fog
  module AWS
    class Storage < Fog::Service
      extend Fog::AWS::CredentialFetcher::ServiceMethods

Severity: Major
Found in lib/fog/aws/storage.rb - About 1 day to fix

Method change_resource_record_sets has a Cognitive Complexity of 67 (exceeds 5 allowed). Consider refactoring.
Open

        def change_resource_record_sets(zone_id, change_batch, options = {})
          response = Excon::Response.new
          errors   = []

Severity: Minor
Found in lib/fog/aws/requests/dns/change_resource_record_sets.rb - About 1 day to fix

File compute.rb has 587 lines of code (exceeds 250 allowed). Consider refactoring.
Open

module Fog
  module AWS
    class Compute < Fog::Service
      extend Fog::AWS::CredentialFetcher::ServiceMethods

Severity: Major
Found in lib/fog/aws/compute.rb - About 1 day to fix

Method get_object has a Cognitive Complexity of 59 (exceeds 5 allowed). Consider refactoring.
Open

        def get_object(bucket_name, object_name, options = {}, &block)
          version_id = options.delete('versionId')

          unless bucket_name
            raise ArgumentError.new('bucket_name is required')
Severity: Minor
Found in lib/fog/aws/requests/storage/get_object.rb - About 1 day to fix

Method end_element has a Cognitive Complexity of 51 (exceeds 5 allowed). Consider refactoring.
Open

          def end_element(name)
            case name
            when 'AutoMinorVersionUpgrade', 'CacheClusterId',
              'CacheClusterStatus', 'CacheNodeType', 'Engine',
              'PreferredAvailabilityZone', 'PreferredMaintenanceWindow'
Severity: Minor
Found in lib/fog/aws/parsers/elasticache/cache_cluster_parser.rb - About 7 hrs to fix

Method associate_address has a Cognitive Complexity of 50 (exceeds 5 allowed). Consider refactoring.
Open

        def associate_address(*args)
          if args.first.kind_of? Hash
            params = args.first
          else
            params = {
Severity: Minor
Found in lib/fog/aws/requests/compute/associate_address.rb - About 7 hrs to fix

File policy_types.rb has 477 lines of code (exceeds 250 allowed). Consider refactoring.
Open

class Fog::AWS::ELB::Mock
  POLICY_TYPES = [{
    "Description" => "",
    "PolicyAttributeTypeDescriptions" => [{
      "AttributeName"=>"CookieName",
Severity: Minor
Found in lib/fog/aws/elb/policy_types.rb - About 7 hrs to fix

Method describe_db_security_groups has a Cognitive Complexity of 43 (exceeds 5 allowed). Consider refactoring.
Open

        def describe_db_security_groups(opts={})
          response = Excon::Response.new
          sec_group_set = []
          if opts.is_a?(String)
            sec_group_name = opts
Severity: Minor
Found in lib/fog/aws/requests/rds/describe_db_security_groups.rb - About 6 hrs to fix

Method end_element has a Cognitive Complexity of 42 (exceeds 5 allowed). Consider refactoring.
Open

          def end_element(name)
            if @in_instanceType
              case name
              when 'value'
                @response['instanceType'] = value
Severity: Minor
Found in lib/fog/aws/parsers/compute/describe_instance_attribute.rb - About 6 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  module AWS
    class CDN
      class Real
        require 'fog/aws/parsers/cdn/streaming_distribution'

Severity: Major
Found in lib/fog/aws/requests/cdn/put_streaming_distribution_config.rb and 1 other location - About 6 hrs to fix
lib/fog/aws/requests/cdn/put_distribution_config.rb on lines 2..110

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
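The duplication issues in this report pair request and parser files that repeat the same body; the generic remedy is to give the shared logic one home and parameterize the part that differs. A minimal, made-up Ruby sketch (names are illustrative, not from fog-aws):

# Before: the same computation appears twice and will drift apart over time.
def streaming_bytes(rows)
  rows.select { |r| r.streaming? }.sum { |r| r.bytes }
end

def download_bytes(rows)
  rows.select { |r| r.download? }.sum { |r| r.bytes }
end

# After: one authoritative implementation, parameterized on what differed.
def bytes_for(rows, kind)
  rows.select { |r| r.public_send("#{kind}?") }.sum { |r| r.bytes }
end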

Tuning

This issue has a mass of 206.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
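For example, a .codeclimate.yml along these lines raises the Ruby mass threshold; the key names here are recalled from codeclimate-duplication's documentation and should be treated as an assumption to verify against the current docs:

# .codeclimate.yml — sketch only; confirm key names in codeclimate-duplication's docs
plugins:
  duplication:
    enabled: true
    config:
      languages:
        ruby:
          mass_threshold: 250   # raise to report less duplication, lower to report more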

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  module AWS
    class CDN
      class Real
        require 'fog/aws/parsers/cdn/distribution'

Severity: Major
Found in lib/fog/aws/requests/cdn/put_distribution_config.rb and 1 other location - About 6 hrs to fix
lib/fog/aws/requests/cdn/put_streaming_distribution_config.rb on lines 2..99

This issue has a mass of 206.

Method end_element has a Cognitive Complexity of 41 (exceeds 5 allowed). Consider refactoring.
Open

          def end_element(name)
            if @in_tag_set
              case name
                when 'item'
                  @security_group['tagSet'][@tag['key']] = @tag['value']
Severity: Minor
Found in lib/fog/aws/parsers/compute/describe_security_groups.rb - About 6 hrs to fix

Method describe_instances has 154 lines of code (exceeds 25 allowed). Consider refactoring.
Open

        def describe_instances(filters = {})
          unless filters.is_a?(Hash)
            Fog::Logger.deprecation("describe_instances with #{filters.class} param is deprecated, use describe_instances('instance-id' => []) instead [light_black](#{caller.first})[/]")
            filters = {'instance-id' => [*filters]}
          end
Severity: Major
Found in lib/fog/aws/requests/compute/describe_instances.rb - About 6 hrs to fix
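The deprecation branch above shows the supported calling convention: pass a filter hash rather than a bare instance id. A short sketch, with an illustrative connection setup (real use needs credentials, e.g. via Fog.credentials):

compute = Fog::Compute.new(provider: 'AWS')   # illustrative; credentials resolved elsewhere

# Deprecated: a bare id is wrapped into {'instance-id' => [...]} and logs a warning
compute.describe_instances('i-0123456789abcdef0')

# Preferred: pass EC2 filters explicitly
compute.describe_instances('instance-id' => ['i-0123456789abcdef0'])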

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  module Parsers
    module AWS
      module ELBV2
        class DescribeLoadBalancers < Fog::Parsers::Base
          def reset
Severity: Major
Found in lib/fog/aws/parsers/elbv2/describe_load_balancers.rb and 1 other location - About 6 hrs to fix
lib/fog/aws/parsers/elbv2/create_load_balancer.rb on lines 2..81

This issue has a mass of 202.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  module Parsers
    module AWS
      module ELBV2
        class CreateLoadBalancer < Fog::Parsers::Base
          def reset
Severity: Major
Found in lib/fog/aws/parsers/elbv2/create_load_balancer.rb and 1 other location - About 6 hrs to fix
lib/fog/aws/parsers/elbv2/describe_load_balancers.rb on lines 2..81

This issue has a mass of 202.

Method delete_object has a Cognitive Complexity of 38 (exceeds 5 allowed). Consider refactoring.
Open

        def delete_object(bucket_name, object_name, options = {})
          response = Excon::Response.new
          if bucket = self.data[:buckets][bucket_name]
            response.status = 204

Severity: Minor
Found in lib/fog/aws/requests/storage/delete_object.rb - About 5 hrs to fix

Method create_db_instance has a Cognitive Complexity of 38 (exceeds 5 allowed). Consider refactoring.
Open

        def create_db_instance(db_name, options={})
          response = Excon::Response.new
          if self.data[:servers] and self.data[:servers][db_name]
            # I don't know how to raise an exception that contains the excon data
            #response.status = 400
Severity: Minor
Found in lib/fog/aws/requests/rds/create_db_instance.rb - About 5 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

  module AWS
    class IAM
      class Real
        require 'fog/aws/parsers/iam/list_managed_policies'

Severity: Major
Found in lib/fog/aws/requests/iam/list_attached_role_policies.rb and 1 other location - About 5 hrs to fix
lib/fog/aws/requests/iam/list_attached_group_policies.rb on lines 2..84

This issue has a mass of 187.
