hackedteam/vector-edk

AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py

Summary

Maintainability: F (estimated 1 wk of remediation effort)
Test Coverage: no data

File spark.py has 581 lines of code (exceeds 250 allowed). Consider refactoring.
Open

#  Copyright (c) 1998-2002 John Aycock
#
#  Permission is hereby granted, free of charge, to any person obtaining
#  a copy of this software and associated documentation files (the
#  "Software"), to deal in the Software without restriction, including
Severity: Major
Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 1 day to fix

    Function makeSet_fast has a Cognitive Complexity of 61 (exceeds 5 allowed). Consider refactoring.
    Open

        def makeSet_fast(self, token, sets, i):
            #
            #  Call *only* when the entire state machine has been built!
            #  It relies on self.edges being filled in completely, and
            #  then duplicates and inlines code to boost speed at the
    Severity: Minor
    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 1 day to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"
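
To make the rules concrete, here is a small, purely hypothetical pair of Python functions (not taken from spark.py). The nested version pays a penalty for every break in linear flow, compounded by nesting depth; the flat version uses a comprehension, the kind of collapsing shorthand that the first rule explicitly does not count as extra complexity.

    # Hypothetical illustration, not from spark.py.
    def sum_positives_nested(rows):
        total = 0
        for row in rows:              # break in linear flow
            if row is not None:       # nested break: larger penalty
                for x in row:         # nested deeper: larger still
                    if x > 0:
                        total += x
        return total

    # Same behavior with linear flow: the comprehension is shorthand for
    # the loops above and is not counted as more complex.
    def sum_positives_flat(rows):
        return sum(x for row in rows if row is not None for x in row if x > 0)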


    Function makeState has a Cognitive Complexity of 31 (exceeds 5 allowed). Consider refactoring.
    Open

        def makeState(self, state, sym):
            assert sym is not None
            #
            #  Compute \epsilon-kernel state's core and see if
            #  it exists already.
    Severity: Minor
    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 4 hrs to fix


    Function makeSet has a Cognitive Complexity of 30 (exceeds 5 allowed). Consider refactoring.
    Open

        def makeSet(self, token, sets, i):
            cur, next = sets[i], sets[i+1]
    
            ttype = token is not None and self.typestring(token) or None
            if ttype is not None:
    Severity: Minor
    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 4 hrs to fix


    Function computeNull has a Cognitive Complexity of 28 (exceeds 5 allowed). Consider refactoring.
    Open

        def computeNull(self):
            self.nullable = {}
            tbd = []
    
            for rulelist in self.rules.values():
    Severity: Minor
    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 4 hrs to fix
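
computeNull finds the grammar's nullable nonterminals, i.e. those that can derive the empty string. A self-contained sketch of the classic fixed-point computation (the grammar encoding here is hypothetical; spark.py's internal representation differs):

    def compute_nullable(rules):
        # rules: dict mapping lhs -> list of rhs tuples; () means epsilon.
        # Iterate to a fixed point: lhs is nullable once some rhs consists
        # entirely of already-nullable symbols.
        nullable = set()
        changed = True
        while changed:
            changed = False
            for lhs, rhslist in rules.items():
                if lhs in nullable:
                    continue
                for rhs in rhslist:
                    if all(sym in nullable for sym in rhs):
                        nullable.add(lhs)
                        changed = True
                        break
        return nullable

    # A derives epsilon directly; B ::= A A is therefore nullable too.
    g = {'A': [()], 'B': [('A', 'A')], 'C': [('c',)]}
    assert compute_nullable(g) == {'A', 'B'}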


    GenericParser has 29 functions (exceeds 20 allowed). Consider refactoring.
    Open

    class GenericParser:
        #
        #  An Earley parser, as per J. Earley, "An Efficient Context-Free
        #  Parsing Algorithm", CACM 13(2), pp. 94-102.  Also J. C. Earley,
        #  "An Efficient Context-Free Parsing Algorithm", Ph.D. thesis,
    Severity: Minor
    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 3 hrs to fix
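
For context on why the class is so large: SPARK parsers are normally built by subclassing GenericParser, with the grammar spread across p_* methods whose docstrings hold the productions (collected by reflection via addRule). A minimal sketch of that convention, using a hypothetical grammar:

    from spark import GenericParser

    class ExprParser(GenericParser):
        def __init__(self, start='expr'):
            GenericParser.__init__(self, start)

        def p_expr(self, args):
            '''
                expr ::= expr + term
                expr ::= term
            '''

        def p_term(self, args):
            '''
                term ::= term * factor
                term ::= factor
            '''

        def p_factor(self, args):
            '''
                factor ::= number
            '''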

      Function makeNewRules has a Cognitive Complexity of 18 (exceeds 5 allowed). Consider refactoring.
      Open

          def makeNewRules(self):
              worklist = []
              for rulelist in self.rules.values():
                  for rule in rulelist:
                      worklist.append((rule, 0, 1, rule))
      Severity: Minor
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 2 hrs to fix


      Function __getstate__ has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.
      Open

          def __getstate__(self):
              if self.ruleschanged:
                  #
                  #  XXX - duplicated from parse()
                  #
      Severity: Minor
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 1 hr to fix


      Function buildTree has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.
      Open

          def buildTree(self, nt, item, tokens, k):
              state, parent = item
      
              choices = []
              for rule in self.states[state].complete:
      Severity: Minor
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 1 hr to fix


      Function parse has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
      Open

          def parse(self, tokens):
              sets = [ [(1,0), (2,0)] ]
              self.links = {}
      
              if self.ruleschanged:
      Severity: Minor
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 55 mins to fix


      Function addRule has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
      Open

          def addRule(self, doc, func, _preprocess=1):
              fn = func
              rules = string.split(doc)
      
              index = []
      Severity: Minor
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 55 mins to fix
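
addRule begins by splitting the rule docstring into whitespace-separated tokens (string.split(doc) is the Python 2 spelling of doc.split()); the index list then, presumably, records where each production starts by locating the '::=' markers. A quick illustration of what the split yields:

    doc = '''
        expr ::= expr + term
        expr ::= term
    '''
    print(doc.split())
    # ['expr', '::=', 'expr', '+', 'term', 'expr', '::=', 'term']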


      Function _namelist has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
      Open

      def _namelist(instance):
          namelist, namedict, classlist = [], {}, [instance.__class__]
          for c in classlist:
              for b in c.__bases__:
                  classlist.append(b)
      Severity: Minor
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix


      Avoid deeply nested control flow statements.
      Open

                              for prule in rules[nextSym]:
                                  ppos = self.skip(prule)
                                  new = (prule, ppos)
                                  NK.items.append(new)
                  #
      Severity: Major
      Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix
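
The usual fix is to extract the innermost block into a helper so the enclosing method loses a nesting level or two. A sketch built only from the lines shown above; the helper name _predict is invented here:

    def _predict(self, rules, nextSym, NK):
        # formerly the innermost block, several indentation levels deep
        for prule in rules[nextSym]:
            ppos = self.skip(prule)
            NK.items.append((prule, ppos))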

        Avoid deeply nested control flow statements.
        Open

                                if nk is not None:
                                    self.add(cur, (nk, i))
        
        Severity: Major
        Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix

          Avoid deeply nested control flow statements.
          Open

                                  if nk is not None:
                                      self.add(next, (nk, i+1))
          
          Severity: Major
          Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix

            Avoid deeply nested control flow statements.
            Open

                                    if nk is not None:
                                        #self.add(cur, (nk, i))
                                        #INLINED --v
                                        new = (nk, i)
                                        if new not in cur:
            Severity: Major
            Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix

              Function _dump has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
              Open

              def _dump(tokens, sets, states):
                  for i in range(len(sets)):
                      print 'set', i
                      for item in sets[i]:
                          print '\t', item
              Severity: Minor
              Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix


              Avoid deeply nested control flow statements.
              Open

                                      if new not in cur:
                                          self.links[key] = []
                                          cur.append(new)
                                      self.links[key].append((pptr, why))
              Severity: Major
              Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix

                Function tokenize has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
                Open

                    def tokenize(self, s):
                        pos = 0
                        n = len(s)
                        while pos < n:
                            m = self.re.match(s, pos)
                Severity: Minor
                Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix
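
The loop has the standard shape of an anchored-match scanner: match at pos, consume the token, advance to m.end(). A standalone sketch with a hypothetical two-token language (GenericScanner itself assembles self.re from its t_* method docstrings):

    import re

    TOKEN_RE = re.compile(r'\s*(?:(\d+)|([+*()]))')

    def tokenize(s):
        pos, n, tokens = 0, len(s), []
        while pos < n:
            m = TOKEN_RE.match(s, pos)
            if m is None:
                raise SyntaxError('bad character at position %d' % pos)
            number, op = m.groups()
            tokens.append(('number', number) if number else ('op', op))
            pos = m.end()   # the match is never empty, so pos always advances
        return tokens

    assert tokenize('1+2*3') == [('number', '1'), ('op', '+'),
                                 ('number', '2'), ('op', '*'),
                                 ('number', '3')]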


                Avoid deeply nested control flow statements.
                Open

                                        if new not in next:
                                            next.append(new)
                                        #INLINED --^
                            else:
                Severity: Major
                Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 45 mins to fix

                  Function add has 5 arguments (exceeds 4 allowed). Consider refactoring.
                  Open

                      def add(self, set, item, i=None, predecessor=None, causal=None):
                  Severity: Minor
                  Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 35 mins to fix
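
A low-risk fix is to fold the two link-related parameters into one optional pair, since callers appear to pass them together. Purely a sketch; the regrouped signature is invented here:

    def add(self, set, item, i=None, link=None):
        # link, if given, is a (predecessor, causal) pair; this keeps the
        # signature at four parameters plus self.
        if link is None:
            if item not in set:
                set.append(item)
            return
        predecessor, causal = link
        # ... the original predecessor/causal handling continues here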

                    Function preorder has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
                    Open

                        def preorder(self, node=None):
                            if node is None:
                                node = self.ast
                    
                            try:
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 25 mins to fix


                    Function add has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
                    Open

                        def add(self, set, item, i=None, predecessor=None, causal=None):
                            if predecessor is None:
                                if item not in set:
                                    set.append(item)
                            else:
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py - About 25 mins to fix


                    Identical blocks of code found in 2 locations. Consider refactoring.
                    Open

                            if self.ruleschanged:
                                self.computeNull()
                                self.newrules = {}
                                self.new2old = {}
                                self.makeNewRules()
                    Severity: Major
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 4 hrs to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 122..133

                    Duplicated Code

                    Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

                    Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

                    When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

                    Tuning

                    This issue has a mass of 83.

                    We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

                    The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

                    If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

                    See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

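
The straightforward DRY fix is to extract the shared recomputation into one helper that both parse() and __getstate__() call. A sketch assembled only from the lines visible in this report; the real blocks may contain more statements, and the helper name is invented:

    def _rebuildRules(self):
        self.computeNull()
        self.newrules = {}
        self.new2old = {}
        self.makeNewRules()

    # both call sites then shrink to:
    #     if self.ruleschanged:
    #         self._rebuildRules()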

                    Identical blocks of code found in 2 locations. Consider refactoring.
                    Open

                            if self.ruleschanged:
                                #
                                #  XXX - duplicated from parse()
                                #
                                self.computeNull()
                    Severity: Major
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 4 hrs to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 305..313

This issue has a mass of 83.

                    Similar blocks of code found in 2 locations. Consider refactoring.
                    Open

                                    if self.newrules.has_key(lhs):
                                        self.newrules[lhs].append(rule)
                                    else:
                                        self.newrules[lhs] = [ rule ]
                    Severity: Major
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 1 hr to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 190..193

This issue has a mass of 45.

                    Similar blocks of code found in 2 locations. Consider refactoring.
                    Open

                                if self.rules.has_key(lhs):
                                    self.rules[lhs].append(rule)
                                else:
                                    self.rules[lhs] = [ rule ]
                    Severity: Major
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 1 hr to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 288..291
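
Both four-line blocks collapse to a single statement with dict.setdefault, which also removes the Python-2-only has_key call:

    # for the block shown above:
    self.rules.setdefault(lhs, []).append(rule)
    # and at the other reported location:
    self.newrules.setdefault(lhs, []).append(rule)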

This issue has a mass of 45.

                    Identical blocks of code found in 2 locations. Consider refactoring.
                    Open

                                if hasattr(self, name):
                                    func = getattr(self, name)
                                    func(node)
                                else:
                                    self.default(node)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 50 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 763..767
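
Both copies reduce to one dispatch expression: getattr's third argument supplies the fallback, so the hasattr test disappears:

    # equivalent to the hasattr/getattr/else block at both locations
    getattr(self, name, self.default)(node)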

This issue has a mass of 36.

                    Identical blocks of code found in 2 locations. Consider refactoring.
                    Open

                            if hasattr(self, name):
                                func = getattr(self, name)
                                func(node)
                            else:
                                self.default(node)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 50 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 739..743

This issue has a mass of 36.

                    Similar blocks of code found in 3 locations. Consider refactoring.
                    Open

                                            if new not in cur:
                                                self.links[key] = []
                                                cur.append(new)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 2 other locations - About 30 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 466..468
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 528..530

This issue has a mass of 32.

                    Similar blocks of code found in 3 locations. Consider refactoring.
                    Open

                                        if new not in next:
                                            self.links[key] = []
                                            next.append(new)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 2 other locations - About 30 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 466..468
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 569..571

This issue has a mass of 32.

                    Similar blocks of code found in 3 locations. Consider refactoring.
                    Open

                                if item not in set:
                                    self.links[key] = []
                                    set.append(item)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 2 other locations - About 30 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 528..530
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 569..571
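
All three blocks share one shape: the first sighting of an item in a set also creates its empty causal-link list. Extracting that shape into a helper (name invented here) removes the triplication:

    def _enter(self, set, item, key):
        if item not in set:
            self.links[key] = []
            set.append(item)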

This issue has a mass of 32.

                    Similar blocks of code found in 2 locations. Consider refactoring.
                    Open

                            rebind = lambda func, self=self: \
                                            lambda args, func=func, self=self: \
                                                    self.foundMatch(args, func)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 30 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 688..690
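
The func=func and self=self default arguments are the pre-Python-2.1 idiom for capturing values before nested scopes existed. Inside the same method, a named closure does the same job and reads more clearly (sketch):

    def rebind(func):
        # self is captured from the enclosing method's scope
        def _call(args):
            return self.foundMatch(args, func)
        return _call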

This issue has a mass of 32.

                    Similar blocks of code found in 2 locations. Consider refactoring.
                    Open

                            rebind = lambda lhs, self=self: \
                                            lambda args, lhs=lhs, self=self: \
                                                    self.buildASTNode(args, lhs)
                    Severity: Minor
                    Found in AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py and 1 other location - About 30 mins to fix
                    AppPkg/Applications/Python/Python-2.7.2/Parser/spark.py on lines 786..788

This issue has a mass of 32.
