enclose-io/compiler

lts/deps/v8/tools/callstats.py

Summary

Maintainability: F (2 mos)
Test Coverage

File callstats.py has 655 lines of code (exceeds 250 allowed). Consider refactoring.
Open

#!/usr/bin/env python
# Copyright 2016 the V8 project authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
'''
Severity: Major
Found in lts/deps/v8/tools/callstats.py - About 1 day to fix

    Function run_site has a Cognitive Complexity of 42 (exceeds 5 allowed). Consider refactoring.
    Open

    def run_site(site, domain, args, timeout=None):
      print("="*80)
      print("RUNNING DOMAIN %s" % domain)
      print("="*80)
      result_template = "{domain}#{count}.txt" if args.repeat else "{domain}.txt"
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 6 hrs to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"
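
    As a rough illustration of how the score accumulates (hypothetical code, not taken from callstats.py), each loop or branch that breaks the linear flow adds a point, and every extra level of nesting adds more; restructuring the same logic so it stays shallow keeps the behaviour but lowers the score:

        import os

        # Deeply nested version: the nesting penalty grows with each level
        # (the point values in the comments are approximate).
        def count_txt_files_nested(paths):
          count = 0
          for path in paths:                  # +1  loop
            if os.path.isdir(path):           # +2  branch, nested once
              for name in os.listdir(path):   # +3  loop, nested twice
                if name.endswith(".txt"):     # +4  branch, nested three deep
                  count += 1
          return count

        # Flatter version: the same work expressed with comprehensions, which is
        # closer to the kind of shorthand the first rule above leaves unpenalised.
        def count_txt_files_flat(paths):
          dirs = [p for p in paths if os.path.isdir(p)]
          return sum(name.endswith(".txt") for d in dirs for name in os.listdir(d))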

    Further reading

    Function read_stats has a Cognitive Complexity of 32 (exceeds 5 allowed). Consider refactoring.
    Open

    def read_stats(path, domain, args):
      groups = [];
      if args.aggregate:
        groups = [
            ('Group-IC', re.compile(".*IC_.*")),
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 4 hrs to fix

    Function _read_logs has a Cognitive Complexity of 25 (exceeds 5 allowed). Consider refactoring.
    Open

    def _read_logs(args):
      versions = {}
      for path in args.logdirs:
        if os.path.isdir(path):
          for root, dirs, files in os.walk(path):
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 3 hrs to fix

    Function print_stats has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.
    Open

    def print_stats(S, args):
      # Sort by ascending/descending time average, then by ascending/descending
      # count average, then by ascending name.
      def sort_asc_func(item):
        return (item[1]['time_stat']['average'],
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 2 hrs to fix

    Function do_run has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.
    Open

    def do_run(args):
      sites = read_sites(args)
      replay_server = start_replay_server(args, sites) if args.replay_wpr else None
      # Disambiguate domains, if needed.
      L = []
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 2 hrs to fix

    Function create_total_page_stats has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.
    Open

    def create_total_page_stats(domains, args):
      total = {}
      def sum_up(parent, key, other):
        sums = parent[key]
        for i, item in enumerate(other[key]):
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 2 hrs to fix

    Function do_json has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring.
    Open

    def do_json(args):
      versions = _read_logs(args)
    
      for version, domains in versions.items():
        if args.aggregate:
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 2 hrs to fix

    Function read_sites_file has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.
    Open

    def read_sites_file(args):
      try:
        sites = []
        try:
          with open(args.sites_file, "rt") as f:
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 1 hr to fix

    Function main has 39 lines of code (exceeds 25 allowed). Consider refactoring.
    Open

    def main():
      parser = argparse.ArgumentParser()
      subparser_adder = parser.add_subparsers(title="commands", dest="command",
                                              metavar="<command>")
      subparsers = {}
    Severity: Minor
    Found in lts/deps/v8/tools/callstats.py - About 1 hr to fix

      Function do_stats has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.
      Open

      def do_stats(args):
        domains = {}
        for path in args.logfiles:
          filename = os.path.basename(path)
          m = re.match(r'^([^#]+)(#.*)?$', filename)
      Severity: Minor
      Found in lts/deps/v8/tools/callstats.py - About 1 hr to fix
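
      The results that run_site writes follow the "{domain}#{count}.txt" naming scheme shown in its excerpt above, and do_stats and _read_logs split that name back apart with regular expressions. A small sketch of that parsing, using the pattern from the _read_logs excerpt further down (the sample file name here is made up):

        import re

        # Hypothetical result file produced with the "{domain}#{count}.txt" template.
        filename = "example.com#3.txt"
        m = re.match(r'^([^#]+)(#.*)?\.txt$', filename)
        domain = m.group(1)   # "example.com"
        run = m.group(2)      # "#3" (None when the file was written without a repeat count)
        print(domain, run)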

      Function do_raw_json has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.
      Open

      def do_raw_json(args):
        versions = _read_logs(args)
      
        for version, domains in versions.items():
          if args.aggregate:
      Severity: Minor
      Found in lts/deps/v8/tools/callstats.py - About 1 hr to fix

      Avoid deeply nested control flow statements.
      Open

                if not regexp.match(key): continue
                entries[group_name]['time'] += time
      Severity: Major
      Found in lts/deps/v8/tools/callstats.py - About 45 mins to fix
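
      One common way to remove this kind of nesting (a sketch of a possible refactoring, not the report's prescribed fix) is to move the innermost matching logic into a small helper so the enclosing loops stay shallow:

        # Hypothetical helper: credit one entry's time/count to the first
        # matching group, so the caller no longer needs the nested if/continue.
        def add_to_matching_group(entries, groups, key, time, count):
          for group_name, regexp in groups:
            if not regexp.match(key):
              continue
            entries[group_name]['time'] += time
            entries[group_name]['count'] += count
            return True
          return False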

        Avoid deeply nested control flow statements.
        Open

                    with open(result, "at") as f:
                      print(file=f)
                      print("URL: {}".format(site), file=f)
                  retries_since_good_run = 0
        Severity: Major
        Found in lts/deps/v8/tools/callstats.py - About 45 mins to fix

          Avoid deeply nested control flow statements.
          Open

                    if filename.endswith(".txt"):
                      m = re.match(r'^([^#]+)(#.*)?\.txt$', filename)
                      domain = m.group(1)
                      if domain not in versions[version]: versions[version][domain] = {}
                      read_stats(os.path.join(root, filename),
          Severity: Major
          Found in lts/deps/v8/tools/callstats.py - About 45 mins to fix

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def main():
              parser = argparse.ArgumentParser()
              subparser_adder = parser.add_subparsers(title="commands", dest="command",
                                                      metavar="<command>")
              subparsers = {}
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 wk to fix
            current/deps/v8/tools/callstats.py on lines 633..774

            Duplicated Code

            Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

            Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

            When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).
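
            In this file the two copies live in the separately vendored lts/ and current/ V8 trees, so they cannot easily share code, but within a single tree the usual cure is to move the shared logic into one place that both call sites import. A sketch under that assumption (the module and function names are invented):

              # callstats_common.py (hypothetical shared module)
              import argparse

              def build_parser():
                parser = argparse.ArgumentParser()
                subparser_adder = parser.add_subparsers(title="commands", dest="command",
                                                        metavar="<command>")
                return parser, subparser_adder

              # Each former copy of main() would then start with:
              #   from callstats_common import build_parser
              #   parser, subparser_adder = build_parser()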

            Tuning

            This issue has a mass of 1174.

            We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

            The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

            If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

            See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
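
            For example, a .codeclimate.yml along these lines (a sketch; check the duplication engine's documentation for the exact keys supported by your analysis version) would raise the Python threshold so that only larger clones are reported:

              version: "2"
              plugins:
                duplication:
                  enabled: true
                  config:
                    languages:
                      python:
                        mass_threshold: 60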

            Refactorings

            Further Reading

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def print_stats(S, args):
              # Sort by ascending/descending time average, then by ascending/descending
              # count average, then by ascending name.
              def sort_asc_func(item):
                return (item[1]['time_stat']['average'],
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 wk to fix
            current/deps/v8/tools/callstats.py on lines 439..493

            This issue has a mass of 805.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

            def read_stats(path, domain, args):
              groups = [];
              if args.aggregate:
                groups = [
                    ('Group-IC', re.compile(".*IC_.*")),
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 wk to fix
            current/deps/v8/tools/callstats.py on lines 371..436

            This issue has a mass of 747.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def run_site(site, domain, args, timeout=None):
              print("="*80)
              print("RUNNING DOMAIN %s" % domain)
              print("="*80)
              result_template = "{domain}#{count}.txt" if args.repeat else "{domain}.txt"
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 6 days to fix
            current/deps/v8/tools/callstats.py on lines 165..232

            This issue has a mass of 629.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def do_run(args):
              sites = read_sites(args)
              replay_server = start_replay_server(args, sites) if args.replay_wpr else None
              # Disambiguate domains, if needed.
              L = []
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 3 days to fix
            current/deps/v8/tools/callstats.py on lines 266..303

            This issue has a mass of 340.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def create_total_page_stats(domains, args):
              total = {}
              def sum_up(parent, key, other):
                sums = parent[key]
                for i, item in enumerate(other[key]):
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 days to fix
            current/deps/v8/tools/callstats.py on lines 521..548

            This issue has a mass of 286.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def do_stats(args):
              domains = {}
              for path in args.logfiles:
                filename = os.path.basename(path)
                m = re.match(r'^([^#]+)(#.*)?$', filename)
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 days to fix
            current/deps/v8/tools/callstats.py on lines 496..517

            This issue has a mass of 268.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def read_sites_file(args):
              try:
                sites = []
                try:
                  with open(args.sites_file, "rt") as f:
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 days to fix
            current/deps/v8/tools/callstats.py on lines 235..257

            This issue has a mass of 241.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def do_json(args):
              versions = _read_logs(args)
            
              for version, domains in versions.items():
                if args.aggregate:
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 days to fix
            current/deps/v8/tools/callstats.py on lines 593..612

            This issue has a mass of 233.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def start_replay_server(args, sites, discard_output=True):
              with tempfile.NamedTemporaryFile(prefix='callstats-inject-', suffix='.js',
                                               mode='wt', delete=False) as f:
                injection = f.name
                generate_injection(f, sites, args.refresh)
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 days to fix
            current/deps/v8/tools/callstats.py on lines 54..80

            This issue has a mass of 222.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def _read_logs(args):
              versions = {}
              for path in args.logdirs:
                if os.path.isdir(path):
                  for root, dirs, files in os.walk(path):
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 day to fix
            current/deps/v8/tools/callstats.py on lines 552..567

            This issue has a mass of 186.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def do_run_replay_server(args):
              sites = read_sites(args)
              print("- " * 40)
              print("Available URLs:")
              for site in sites:
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 day to fix
            current/deps/v8/tools/callstats.py on lines 306..323

            This issue has a mass of 178.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

              if N > 1:
                # evaluate sample variance by setting delta degrees of freedom (ddof) to
                # 1. The degree used in calculations is N - ddof
                stddev = numpy.std(data, ddof=1)
                # Get the endpoints of the range that contains 95% of the distribution
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 day to fix
            current/deps/v8/tools/callstats.py on lines 338..353

            This issue has a mass of 157.
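
            The excerpt above is the per-metric statistics helper: it takes the sample standard deviation (ddof=1, i.e. divide by N - 1) and uses Student's t distribution to form a 95% confidence interval around the mean of the repeated runs. A standalone sketch of the same calculation, with made-up data and reconstructed surrounding variables (average, N, the ci dict); the excerpt only shows numpy, so scipy.stats for the t bounds is an assumption:

              from math import sqrt
              import numpy
              from scipy import stats

              data = [10.2, 11.1, 9.8, 10.6, 10.9]   # hypothetical per-run timings
              N = len(data)
              average = numpy.average(data)
              # Sample variance: ddof=1 makes numpy divide by N - 1 instead of N.
              stddev = numpy.std(data, ddof=1)
              # Endpoints of the central 95% of the t distribution with N - 1 dof.
              t_bounds = stats.t.interval(0.95, N - 1)
              ci = {
                  'abs': t_bounds[1] * stddev / sqrt(N),
                  'low': average + t_bounds[0] * stddev / sqrt(N),
                  'high': average + t_bounds[1] * stddev / sqrt(N),
                  'perc': t_bounds[1] * stddev / sqrt(N) / average * 100,
              }
              print(average, ci)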

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def do_raw_json(args):
              versions = _read_logs(args)
            
              for version, domains in versions.items():
                if args.aggregate:
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 day to fix
            current/deps/v8/tools/callstats.py on lines 569..588

            This issue has a mass of 144.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def print_command(cmd_args):
              def fix_for_printing(arg):
                m = re.match(r'^--([^=]+)=(.*)$', arg)
                if m and (' ' in m.group(2) or m.group(2).startswith('-')):
                  arg = "--{}='{}'".format(m.group(1), m.group(2))
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 day to fix
            current/deps/v8/tools/callstats.py on lines 43..51

            This issue has a mass of 129.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def add_category_total(entries, groups, category_prefix):
              group_data = { 'time': 0, 'count': 0 }
              for group_name, regexp in groups:
                if not group_name.startswith('Group-' + category_prefix): continue
                group_data['time'] += entries[group_name]['time']
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 day to fix
            current/deps/v8/tools/callstats.py on lines 362..368

            This issue has a mass of 122.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

              if abs(stddev) > 0.0001 and abs(average) > 0.0001:
                ci['perc'] = t_bounds[1] * stddev / sqrt(N) / average * 100
              else:
                ci['perc'] = 0
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 5 hrs to fix
            current/deps/v8/tools/callstats.py on lines 354..357

            This issue has a mass of 86.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def do_help(parser, subparsers, args):
              if args.help_cmd:
                if args.help_cmd in subparsers:
                  subparsers[args.help_cmd].print_help()
                else:
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 3 hrs to fix
            current/deps/v8/tools/callstats.py on lines 617..624

            This issue has a mass of 67.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def get_chrome_flags(js_flags, user_data_dir, arg_delimiter=""):
              return [
                  "--no-default-browser-check",
                  "--no-sandbox",
                  "--disable-translate",
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 hrs to fix
            current/deps/v8/tools/callstats.py on lines 134..146

            This issue has a mass of 56.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def get_chrome_replay_flags(args, arg_delimiter=""):
              http_port = 4080 + args.port_offset
              https_port = 4443 + args.port_offset
              return [
                  "--host-resolver-rules=%sMAP *:80 localhost:%s, "  \
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 hrs to fix
            current/deps/v8/tools/callstats.py on lines 149..162

            This issue has a mass of 56.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def stop_replay_server(server):
              print("SHUTTING DOWN REPLAY SERVER %s" % server['process'].pid)
              server['process'].terminate()
              os.remove(server['injection'])
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 2 hrs to fix
            current/deps/v8/tools/callstats.py on lines 83..86

            This issue has a mass of 55.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def coexist(*l):
              given = sum(1 for x in l if x)
              return given == 0 or given == len(l)
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 hr to fix
            current/deps/v8/tools/callstats.py on lines 629..631

            This issue has a mass of 47.
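
            coexist is an all-or-nothing check: it returns True when either every argument is truthy or none are, the sort of guard that suits mutually dependent command-line options. A usage sketch (the argument values are made up):

              def coexist(*l):
                given = sum(1 for x in l if x)
                return given == 0 or given == len(l)

              assert coexist("archive.wpr", "replay.log")   # both given  -> True
              assert coexist(None, None)                    # both absent -> True
              assert not coexist("archive.wpr", None)       # mismatched  -> False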

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

            def read_sites(args):
              # Determine the websites to benchmark.
              if args.sites_file:
                return read_sites_file(args)
              return [{'url': site, 'timeout': args.timeout} for site in args.sites]
            Severity: Major
            Found in lts/deps/v8/tools/callstats.py and 1 other location - About 1 hr to fix
            current/deps/v8/tools/callstats.py on lines 260..264

            This issue has a mass of 40.
