hackedteam/vector-edk

BaseTools/Source/Python/Workspace/MetaFileParser.py

Summary

Maintainability: F (estimated 2 months of remediation effort)
Test Coverage: not reported

File MetaFileParser.py has 1354 lines of code (exceeds 250 allowed). Consider refactoring.
Open

## @file
# This file is used to parse meta files
#
# Copyright (c) 2008 - 2012, Intel Corporation. All rights reserved.<BR>
# This program and the accompanying materials
Severity: Major
Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 3 days to fix

    Function Start has a Cognitive Complexity of 63 (exceeds 5 allowed). Consider refactoring.
    Open

        def Start(self):
            NmakeLine = ''
            Content = ''
            try:
                Content = open(str(self.MetaFile), 'r').readlines()
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 1 day to fix

    Cognitive Complexity

    Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

    A method's cognitive complexity is based on a few simple rules:

    • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
    • Code is considered more complex for each "break in the linear flow of the code"
    • Code is considered more complex when "flow breaking structures are nested"

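    To make these rules concrete, here is a small illustrative pair of functions (hypothetical code, not taken from this repository): both compute the same result, but the second nests its checks inside the loop and inside each other, so the metric rates it as harder to read.

        # Hypothetical example, not from MetaFileParser.py: same behavior,
        # different Cognitive Complexity, because the second version nests
        # its flow-breaking structures.

        def sum_valid_flat(items):
            # A guard clause keeps the flow linear inside the loop.
            total = 0
            for item in items:
                if item is None or item < 0:
                    continue
                total += item
            return total

        def sum_valid_nested(items):
            # Each condition nested inside the loop (and inside the previous
            # condition) is penalized more heavily than a flat check.
            total = 0
            for item in items:
                if item is not None:
                    if item >= 0:
                        total += item
            return total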

    Function ParseMacro has a Cognitive Complexity of 36 (exceeds 5 allowed). Consider refactoring.
    Open

    def ParseMacro(Parser):
        def MacroParser(self):
            Match = gMacroDefPattern.match(self._CurrentLine)
            if not Match:
                # Not 'DEFINE/EDK_GLOBAL' statement, call decorated method
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 5 hrs to fix

    Function __ProcessDirective has a Cognitive Complexity of 35 (exceeds 5 allowed). Consider refactoring.
    Open

        def __ProcessDirective(self):
            Result = None
            if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF]:
                Macros = self._Macros
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 5 hrs to fix

    Function _PostProcess has a Cognitive Complexity of 32 (exceeds 5 allowed). Consider refactoring.
    Open

        def _PostProcess(self):
            Processer = {
                MODEL_META_DATA_SECTION_HEADER                  :   self.__ProcessSectionHeader,
                MODEL_META_DATA_SUBSECTION_HEADER               :   self.__ProcessSubsectionHeader,
                MODEL_META_DATA_HEADER                          :   self.__ProcessDefine,
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 4 hrs to fix

    Function Start has a Cognitive Complexity of 26 (exceeds 5 allowed). Consider refactoring.
    Open

        def Start(self):
            Content = ''
            try:
                Content = open(str(self.MetaFile), 'r').readlines()
            except:
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 3 hrs to fix
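    One incremental way to shrink the various Start() methods (a sketch only; ReadMetaFileLines is an invented helper, not BaseTools code) is to move the file reading behind a small helper, so the parsing loop no longer carries the try/except and the bare except is narrowed to the I/O errors that can actually occur. The except block could keep reporting through EdkLogger as the existing code does.

        def ReadMetaFileLines(MetaFilePath):
            # Hypothetical helper, not part of BaseTools: return the lines of
            # a meta file, narrowing the bare "except:" to real I/O failures
            # and closing the handle via the context manager.
            try:
                with open(str(MetaFilePath), 'r') as MetaFileHandle:
                    return MetaFileHandle.readlines()
            except (IOError, OSError) as Error:
                raise RuntimeError("Cannot read meta file %s: %s" % (MetaFilePath, Error))

    Each Start() implementation would then begin with Content = ReadMetaFileLines(self.MetaFile) and keep only its parsing loop.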

    Function _SectionHeaderParser has a Cognitive Complexity of 24 (exceeds 5 allowed). Consider refactoring.
    Open

        def _SectionHeaderParser(self):
            self._Scope = []
            self._SectionName = ''
            ArchList = set()
            for Item in GetSplitValueList(self._CurrentLine[1:-1], TAB_COMMA_SPLIT):
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 3 hrs to fix

    DscParser has 26 functions (exceeds 20 allowed). Consider refactoring.
    Open

    class DscParser(MetaFileParser):
        # DSC file supported data types (one type per section)
        DataType = {
            TAB_SKUIDS.upper()                          :   MODEL_EFI_SKU_ID,
            TAB_LIBRARIES.upper()                       :   MODEL_EFI_LIBRARY_INSTANCE,
    Severity: Minor
    Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 3 hrs to fix
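    If the function count itself is worth attacking, one possible direction (a sketch only; the class and method names below are invented, not existing BaseTools code) is to move each section's parsing into a small handler object and keep a dispatch table in DscParser, so the class shrinks without changing the section grammar.

        class SectionHandler(object):
            # Hypothetical base class: one object per DSC section kind.
            def Parse(self, Parser, Line):
                raise NotImplementedError

        class SkuIdSectionHandler(SectionHandler):
            def Parse(self, Parser, Line):
                # [SkuIds] entries are "Number|Name" pairs.
                Parts = [Part.strip() for Part in Line.split('|', 1)]
                if len(Parts) != 2:
                    raise ValueError("Malformed SkuId entry: %s" % Line)
                Parser.StoreSkuId(Parts[0], Parts[1])   # StoreSkuId is hypothetical

        # DscParser would then keep one small dispatch table instead of a
        # parsing method per section:
        SECTION_HANDLERS = {
            'SKUIDS': SkuIdSectionHandler(),
            # 'LIBRARYCLASSES': ..., 'COMPONENTS': ..., and so on
        }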

      Function _SectionHeaderParser has a Cognitive Complexity of 21 (exceeds 5 allowed). Consider refactoring.
      Open

          def _SectionHeaderParser(self):
              self._Scope = []
              self._SectionName = ''
              self._SectionType = []
              ArchList = set()
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 2 hrs to fix

      Function _GetApplicableSectionMacro has a Cognitive Complexity of 18 (exceeds 5 allowed). Consider refactoring.
      Open

          def _GetApplicableSectionMacro(self):
              Macros = {}
      
              ComComMacroDict = {}
              ComSpeMacroDict = {}
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 2 hrs to fix

      Function Start has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.
      Open

          def Start(self):
              Content = ''
              try:
                  Content = open(str(self.MetaFile), 'r').readlines()
              except:
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 2 hrs to fix

      Function _DirectiveParser has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.
      Open

          def _DirectiveParser(self):
              self._ValueList = ['', '', '']
              TokenList = GetSplitValueList(self._CurrentLine, ' ', 1)
              self._ValueList[0:len(TokenList)] = TokenList
      
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 2 hrs to fix

      Function _PcdParser has a Cognitive Complexity of 15 (exceeds 5 allowed). Consider refactoring.
      Open

          def _PcdParser(self):
              TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 1)
              self._ValueList[0:1] = GetSplitValueList(TokenList[0], TAB_SPLIT)
              ValueRe = re.compile(r'^[a-zA-Z_][a-zA-Z0-9_]*')
              # check PCD information
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 1 hr to fix

      Function _IncludeParser has a Cognitive Complexity of 15 (exceeds 5 allowed). Consider refactoring.
      Open

          def _IncludeParser(self):
              TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
              self._ValueList[0:len(TokenList)] = TokenList
              Macros = self._Macros
              if Macros:
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 1 hr to fix

      Function __RetrievePcdValue has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.
      Open

          def __RetrievePcdValue(self):
              Records = self._RawTable.Query(MODEL_PCD_FEATURE_FLAG, BelongsToItem= -1.0)
              for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
                  Name = TokenSpaceGuid + '.' + PcdName
                  ValList, Valid, Index = AnalyzeDscPcd(Value, MODEL_PCD_FEATURE_FLAG)
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 1 hr to fix

      Function _PcdParser has a Cognitive Complexity of 10 (exceeds 5 allowed). Consider refactoring.
      Open

          def _PcdParser(self):
              TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 1)
              self._ValueList[0:1] = GetSplitValueList(TokenList[0], TAB_SPLIT)
              if len(TokenList) == 2:
                  self._ValueList[2] = TokenList[1]
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 1 hr to fix

      Function __getitem__ has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
      Open

          def __getitem__(self, DataInfo):
              if type(DataInfo) != type(()):
                  DataInfo = (DataInfo,)
      
              # Parse the file first, if necessary
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 55 mins to fix

      Function _PcdParser has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.
      Open

          def _PcdParser(self):
              TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 1)
              ValueList = GetSplitValueList(TokenList[0], TAB_SPLIT)
              if len(ValueList) != 2:
                  EdkLogger.error('Parser', FORMAT_INVALID, "Illegal token space GUID and PCD name format",
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 55 mins to fix
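      For context on what this check enforces, here is an illustrative, standalone restatement (the helper below is hypothetical and uses plain '|' and '.' literals in place of the TAB_* constants): a PCD entry has the form TokenSpaceGuidCName.PcdCName|Value, and the part before the first '|' must split into exactly two C-identifier names.

          import re

          _C_NAME = re.compile(r'^[a-zA-Z_][a-zA-Z0-9_]*$')

          def SplitPcdEntry(Line):
              # Hypothetical standalone version of the format check above.
              TokenList = Line.split('|', 1)
              NameParts = [Part.strip() for Part in TokenList[0].split('.')]
              if len(NameParts) != 2 or not all(_C_NAME.match(Part) for Part in NameParts):
                  raise ValueError("Illegal token space GUID and PCD name format: %s" % Line)
              Value = TokenList[1].strip() if len(TokenList) == 2 else ''
              return NameParts[0], NameParts[1], Value

          # SplitPcdEntry("gEfiMdePkgTokenSpaceGuid.PcdDebugPropertyMask|0x0f")
          # -> ('gEfiMdePkgTokenSpaceGuid', 'PcdDebugPropertyMask', '0x0f')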

      Function __ProcessDefine has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
      Open

          def __ProcessDefine(self):
              if not self._Enabled:
                  return
      
              Type, Name, Value = self._ValueList
      Severity: Minor
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 45 mins to fix

      Avoid deeply nested control flow statements.
      Open

                              if NextLine[0] == TAB_SECTION_START and NextLine[-1] == TAB_SECTION_END:
                                  self._CurrentLine = NmakeLine + Line[0:-1]
                                  NmakeLine = ''
                              else:
                                  NmakeLine = NmakeLine + ' ' + Line[0:-1]
      Severity: Major
      Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 45 mins to fix
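      A generic way to reduce this kind of nesting (illustrative only; the functions below are made up and much simpler than the real NMAKE-line handling) is to invert the inner conditions into guard clauses so every branch sits one level shallower:

          def process_nested(lines):
              out = []
              for line in lines:
                  if line:
                      if not line.startswith('#'):
                          if line.endswith('\\'):
                              out.append(line[:-1])
                          else:
                              out.append(line)
              return out

          def process_flat(lines):
              # Same behavior; guard clauses replace two levels of nesting.
              out = []
              for line in lines:
                  if not line or line.startswith('#'):
                      continue
                  out.append(line[:-1] if line.endswith('\\') else line)
              return out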

        Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
        Open

            def __init__(self, FilePath, FileType, Table, Owner= -1, From= -1):
        Severity: Minor
        Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 35 mins to fix

          Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.
          Open

              def __init__(self, FilePath, FileType, Table, Owner= -1, From= -1):
          Severity: Minor
          Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 35 mins to fix
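          If the argument count is worth addressing, one conventional remedy (a sketch under the assumption that call sites can be updated; ParserContext is an invented name) is to bundle the bookkeeping values into a small object and pass that instead:

              class ParserContext(object):
                  # Hypothetical value object bundling the arguments that the
                  # flagged __init__ methods currently take individually.
                  def __init__(self, Table, Owner=-1, From=-1):
                      self.Table = Table
                      self.Owner = Owner
                      self.From = From

              # A parser constructor could then take three parameters instead
              # of five:
              #     def __init__(self, FilePath, FileType, Context): ...
              # reading Context.Table / Context.Owner / Context.From where it
              # previously used the separate arguments.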

            Function __ProcessPcd has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.
            Open

                def __ProcessPcd(self):
                    if self._ItemType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
                        self._ValueList[2] = ReplaceMacro(self._ValueList[2], self._Macros, RaiseError=True)
                        return
            
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 35 mins to fix

            Function _DefineParser has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
            Open

                def _DefineParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
                    self._ValueList[1:len(TokenList)] = TokenList
                    if not self._ValueList[1]:
                        EdkLogger.error('Parser', FORMAT_INVALID, "No name specified",
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py - About 25 mins to fix

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                    elif self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF:
                        # Back to the nearest !if/!ifdef/!ifndef
                        while self._DirectiveStack:
                            self._DirectiveEvalStack.pop()
                            Directive = self._DirectiveStack.pop()
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 5 days to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1283..1358

            Duplicated Code

            Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

            Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

            When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

            Tuning

            This issue has a mass of 510.

            We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project's guidelines.

            The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

            If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

            See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

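            The duplication findings in this report all pair this file with its near-copy under BaseTools/Source/Python/Ecc/MetaFileWorkspace/. The usual remedy is to move each identical block into a module both files import. As a minimal, hedged sketch (the module and function below are invented, and the real duplicated blocks are larger), the boolean-value normalization that both copies repeat could become a shared helper:

                # Hypothetical shared module (e.g. a new MetaFileCommon.py)
                # imported by Workspace/MetaFileParser.py and
                # Ecc/MetaFileWorkspace/MetaFileParser.py alike.

                def NormalizeBooleanToken(Token):
                    # Map the textual booleans both parsers special-case to
                    # '1'/'0'; return anything else unchanged.
                    if Token in ('True', 'true', 'TRUE'):
                        return '1'
                    if Token in ('False', 'false', 'FALSE'):
                        return '0'
                    return Token

                # The larger identical blocks (the DataType tables,
                # _DefineParser, _IncludeParser, the !if/!endif directive
                # handling) could move into the same shared module or a
                # common base/mixin class in the same way.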

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _DefineParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
                    self._ValueList[1:len(TokenList)] = TokenList
                    if not self._ValueList[1]:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 days to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 310..334

            Duplication mass: 302.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _DefineParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
                    self._ValueList[1:len(TokenList)] = TokenList
            
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 days to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 933..952

            Duplication mass: 257.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _IncludeParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
                    self._ValueList[0:len(TokenList)] = TokenList
                    Macros = self._Macros
                    if Macros:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 days to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 552..573

            Duplication mass: 241.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF:
                        # Remove all directives between !if and !endif, including themselves
                        while self._DirectiveStack:
                            # Remove any !else or !elseif
                            DirectiveInfo = self._DirectiveStack.pop()
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 days to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 883..909

            Duplication mass: 235.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _BinaryFileParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 2)
                    if len(TokenList) < 2:
                        EdkLogger.error('Parser', FORMAT_INVALID, "No file type or path specified",
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 days to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 596..612

            Duplication mass: 227.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _LibraryClassParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
                    if len(TokenList) < 2:
                        EdkLogger.error('Parser', FORMAT_INVALID, "No library class or instance specified",
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1011..1027

            Duplication mass: 202.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if Line[0] == TAB_SECTION_START and Line[-1] == TAB_SECTION_END:
                            if not GetHeaderComment:
                                for Cmt, LNo in Comments:
                                    self._Store(MODEL_META_DATA_HEADER_COMMENT, Cmt, '', '', 'COMMON',
                                                'COMMON', self._Owner[-1], LNo, -1, LNo, -1, 0)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 473..514

            Duplication mass: 170.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def __getitem__(self, DataInfo):
                    if type(DataInfo) != type(()):
                        DataInfo = (DataInfo,)
            
                    # Parse the file first, if necessary
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 211..232

            Duplication mass: 160.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if Type == TAB_DSC_DEFINES_DEFINE:
                        #
                        # First judge whether this DEFINE is in conditional directive statements or not.
                        #
                        if type(self) == DscParser and self._InDirective > -1:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 67..91

            Duplication mass: 158.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                          MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF]:
                        Macros = self._Macros
                        Macros.update(GlobalData.gGlobalDefines)
                        try:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1244..1261

            Duplication mass: 153.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _SourceFileParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
                    self._ValueList[0:len(TokenList)] = TokenList
                    Macros = self._Macros
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 579..590

            Duplication mass: 152.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                DataType = {
                    TAB_SKUIDS.upper()                          :   MODEL_EFI_SKU_ID,
                    TAB_LIBRARIES.upper()                       :   MODEL_EFI_LIBRARY_INSTANCE,
                    TAB_LIBRARY_CLASSES.upper()                 :   MODEL_EFI_LIBRARY_CLASS,
                    TAB_BUILD_OPTIONS.upper()                   :   MODEL_META_DATA_BUILD_OPTION,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 687..712

            Duplication mass: 149.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if self._ValueList[2] != '':
                        InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
                        if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
                            self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1);
                        elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 641..646

            Duplication mass: 142.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                            if self._Version < 0x00010005:
                                if self._SectionType in [MODEL_META_DATA_BUILD_OPTION,
                                                         MODEL_EFI_LIBRARY_CLASS,
                                                         MODEL_META_DATA_PACKAGE,
                                                         MODEL_PCD_FIXED_AT_BUILD,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 476..497

            Duplication mass: 141.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                          MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
                                          MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
                        self._DirectiveStack.append(self._ItemType)
                        if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1263..1275

            Duplication mass: 130.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                DataType = {
                    TAB_UNKNOWN.upper() : MODEL_UNKNOWN,
                    TAB_INF_DEFINES.upper() : MODEL_META_DATA_HEADER,
                    TAB_DSC_DEFINES_DEFINE : MODEL_META_DATA_DEFINE,
                    TAB_BUILD_OPTIONS.upper() : MODEL_META_DATA_BUILD_OPTION,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 390..411

            This issue has a mass of 129.
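
            One way to collapse this table (and the DEC table flagged further below) is to define the section-name-to-model mapping once in a module shared by the Workspace and Ecc parsers. The sketch below is only an illustration: the module and constant names are invented, and only the entries visible in the excerpt are listed.

                # Hypothetical shared module (name and location are assumptions), imported by
                # both Workspace/MetaFileParser.py and Ecc/MetaFileWorkspace/MetaFileParser.py.
                from Common.DataType import *            # TAB_* section-name constants
                from CommonDataClass.DataClass import *  # MODEL_* record-type constants

                INF_SECTION_MODEL = {
                    TAB_UNKNOWN.upper()       : MODEL_UNKNOWN,
                    TAB_INF_DEFINES.upper()   : MODEL_META_DATA_HEADER,
                    TAB_DSC_DEFINES_DEFINE    : MODEL_META_DATA_DEFINE,
                    TAB_BUILD_OPTIONS.upper() : MODEL_META_DATA_BUILD_OPTION,
                    # ... remaining entries exactly as in the current DataType attribute ...
                }

            Each parser would then bind DataType = INF_SECTION_MODEL, keeping the attribute name the rest of the class already relies on.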

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    Processer = {
                        MODEL_META_DATA_SECTION_HEADER                  :   self.__ProcessSectionHeader,
                        MODEL_META_DATA_SUBSECTION_HEADER               :   self.__ProcessSubsectionHeader,
                        MODEL_META_DATA_HEADER                          :   self.__ProcessDefine,
                        MODEL_META_DATA_DEFINE                          :   self.__ProcessDefine,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 day to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1069..1098

            This issue has a mass of 123.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _SubsectionHeaderParser(self):
                    self._SubsectionName = self._CurrentLine[1:-1].upper()
                    if self._SubsectionName in self.DataType:
                        self._SubsectionType = self.DataType[self._SubsectionName]
                    else:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 7 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 856..864

            This issue has a mass of 113.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if TokenList[1][0] != '{' or TokenList[1][-1] != '}' or GuidStructureStringToGuidString(TokenList[1]) == '':
                        EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
                                        ExtraData=self._CurrentLine + \
                                                  " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 6 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1637..1641

            This issue has a mass of 101.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _GetMacros(self):
                    Macros = {}
                    Macros.update(self._FileLocalMacros)
                    Macros.update(self._GetApplicableSectionMacro())
                    Macros.update(GlobalData.gEdkGlobal)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 5 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1056..1066

            This issue has a mass of 97.
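
            Since the Workspace and Ecc trees each carry a full copy of this helper, the usual fix is to host it once in a small base class that both MetaFileParser variants inherit. The sketch below assumes a hypothetical class name and module placement, and simply restates the logic visible in the excerpt, adding the return statement the excerpt truncates.

                # Hypothetical shared base class; the module placement and import style are
                # assumptions, not the file's current layout.
                import Common.GlobalData as GlobalData

                class MetaFileParserBase(object):
                    ## Merge every macro visible at the current parse position.
                    def _GetMacros(self):
                        Macros = {}
                        Macros.update(self._FileLocalMacros)              # file-local DEFINEs
                        Macros.update(self._GetApplicableSectionMacro())  # section-scoped DEFINEs
                        Macros.update(GlobalData.gEdkGlobal)              # EDK_GLOBAL values
                        return Macros                                     # truncated in the excerpt above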

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _SkuIdParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
                    if len(TokenList) != 2:
                        EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Integer>|<UiName>'",
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 5 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 954..960

            This issue has a mass of 86.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _PathParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
                    self._ValueList[0:len(TokenList)] = TokenList
                    # Don't do macro replacement for dsc file at this point
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 249..256

            This issue has a mass of 82.

            Similar blocks of code found in 6 locations. Consider refactoring.
            Open

                    if self._ValueList[0] == '' or self._ValueList[1] == '':
                        EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
                                        ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 5 other locations - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 635..638
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 986..989
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1679..1683
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1080..1083
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1767..1771

            This issue has a mass of 76.
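
            The six flagged call sites differ only in the format hint appended to ExtraData, so the check itself can live in one place. A minimal sketch, assuming a hypothetical helper named _CheckPcdNames on the shared parser class:

                # Hypothetical helper; each call site keeps only its own format hint and
                # delegates the identical empty-field check.
                def _CheckPcdNames(self, FormatHint):
                    if self._ValueList[0] == '' or self._ValueList[1] == '':
                        EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
                                        ExtraData=self._CurrentLine + " " + FormatHint,
                                        File=self.MetaFile, Line=self._LineIndex + 1)

                # Example call sites, with hint strings taken from the excerpts in this report:
                #   self._CheckPcdNames("(<TokenSpaceGuidCName>.<PcdCName>)")
                #   self._CheckPcdNames("(<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)")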

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _ComponentParser(self):
                    if self._CurrentLine[-1] == '{':
                        self._ValueList[0] = self._CurrentLine[0:-1].strip()
                        self._InSubsection = True
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1002..1008

            This issue has a mass of 76.

            Similar blocks of code found in 6 locations. Consider refactoring.
            Open

                    if self._ValueList[0] == '' or self._ValueList[1] == '':
                        EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
                                        ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 5 other locations - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 635..638
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 986..989
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1679..1683
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 714..717
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1767..1771

            This issue has a mass of 76.

            Similar blocks of code found in 6 locations. Consider refactoring.
            Open

                    if self._ValueList[0] == '' or self._ValueList[1] == '':
                        EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
                                        ExtraData=self._CurrentLine + \
                                                  " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 5 other locations - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 635..638
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 986..989
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1679..1683
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 714..717
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1080..1083

            This issue has a mass of 76.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                DataType = {
                    TAB_DEC_DEFINES.upper()                     :   MODEL_META_DATA_HEADER,
                    TAB_DSC_DEFINES_DEFINE                      :   MODEL_META_DATA_DEFINE,
                    TAB_INCLUDES.upper()                        :   MODEL_EFI_INCLUDE,
                    TAB_LIBRARY_CLASSES.upper()                 :   MODEL_EFI_LIBRARY_CLASS,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1439..1451

            This issue has a mass of 75.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if len(PtrValue) != 0:
                        ptrValueList = re.sub(ValueRe, '', TokenList[1])
                        ValueList = GetSplitValueList(ptrValueList)
                        ValueList[0] = PtrValue[0]
                    else:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1696..1701

            This issue has a mass of 75.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                    for Arch, ModuleType in Scope:
                        self._LastItem = self._Store(
                                                ItemType,
                                                self._ValueList[0],
                                                self._ValueList[1],
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 4 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1525..1538

            This issue has a mass of 75.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if Line[0] == TAB_SECTION_START and Line[-1] == TAB_SECTION_END:
                            self._SectionHeaderParser()
                            self._Comments = []
                            continue
                        elif len(self._SectionType) == 0:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1504..1510

            This issue has a mass of 73.

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if len(TokenList2) == 2:
                        self._ValueList[0] = TokenList2[0]  # toolchain family
                        self._ValueList[1] = TokenList2[1]  # keys
                    else:
                        self._ValueList[1] = TokenList[0]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 341..345
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1037..1041
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 356..360

            This issue has a mass of 72.

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if len(TokenList2) == 2:
                        self._ValueList[0] = TokenList2[0]              # toolchain family
                        self._ValueList[1] = TokenList2[1]              # keys
                    else:
                        self._ValueList[1] = TokenList[0]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 341..345
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1037..1041
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1146..1150

            This issue has a mass of 72.

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if self._ValueList[1].count('_') != 4:
                        EdkLogger.error(
                            'Parser',
                            FORMAT_INVALID,
                            "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 349..356
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1045..1052
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1154..1161

            This issue has a mass of 71.
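
            This flag-format check recurs at the four build-option parsing sites listed above (the toolchain-family split flagged just before it follows the same pattern), so it is another candidate for a single shared helper. A sketch under the same caveats: the helper name is invented, and the trailing ExtraData, File and Line arguments are assumptions modeled on the other EdkLogger.error calls quoted in this report, since the excerpt cuts the call short.

                # Hypothetical helper for the shared *_FLAGS key validation.
                def _CheckBuildOptionFlag(self):
                    if self._ValueList[1].count('_') != 4:
                        EdkLogger.error(
                            'Parser',
                            FORMAT_INVALID,
                            "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
                            ExtraData=self._CurrentLine,   # assumed, as in the other error calls above
                            File=self.MetaFile,
                            Line=self._LineIndex + 1)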

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if self._ValueList[1].count('_') != 4:
                        EdkLogger.error(
                            'Parser',
                            FORMAT_INVALID,
                            "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 349..356
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1045..1052
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 364..371

            This issue has a mass of 71.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                        self._LastItem = self._Store(
                                            self._ItemType,
                                            self._ValueList[0],
                                            self._ValueList[1],
                                            self._ValueList[2],
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 914..927

            This issue has a mass of 71.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                            self._LastItem = self._Store(
                                                    self._ItemType,
                                                    self._ValueList[0],
                                                    self._ValueList[1],
                                                    self._ValueList[2],
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1285..1298

            This issue has a mass of 71.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if len(TokenList) < 2 or TokenList[1] == '':
                        EdkLogger.error('Parser', FORMAT_INVALID, "No PCD Datum information given",
                                        ExtraData=self._CurrentLine + \
                                                  " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1685..1689

            This issue has a mass of 70.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if DirectiveName in ['!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'] and self._ValueList[1] == '':
                        EdkLogger.error("Parser", FORMAT_INVALID, "Missing expression",
                                        File=self.MetaFile, Line=self._LineIndex + 1,
                                        ExtraData=self._CurrentLine)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 877..880

            This issue has a mass of 68.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if self._SectionName != '' and self._SectionName != ItemList[0].upper():
                            EdkLogger.error('Parser', FORMAT_INVALID, "Different section names in the same section",
                                            File=self.MetaFile, Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 279..281

            This issue has a mass of 68.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if MODEL_PCD_FEATURE_FLAG in self._SectionType and len(self._SectionType) > 1:
                            EdkLogger.error(
                                        'Parser',
                                        FORMAT_INVALID,
                                        "%s must not be in the same section of other types of PCD" % TAB_PCDS_FEATURE_FLAG_NULL,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1593..1600

            This issue has a mass of 68.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    self._ValueList[2] = ValueList[0].strip() + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1739..1739

            This issue has a mass of 67.
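
            Beyond sharing the line between the two files, the hand-built concatenation can also lean on the separator constant the parsers already pass to GetSplitValueList. A hedged one-line equivalent, assuming TAB_VALUE_SPLIT is the usual '|' constant:

                    # Equivalent to the flagged line, joining the three stripped fields with '|'.
                    self._ValueList[2] = TAB_VALUE_SPLIT.join(Value.strip() for Value in ValueList[0:3])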

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                _SectionParser = {
                    MODEL_UNKNOWN                   :   MetaFileParser._Skip,
                    MODEL_META_DATA_HEADER          :   MetaFileParser._DefineParser,
                    MODEL_META_DATA_BUILD_OPTION    :   MetaFileParser._BuildOptionParser,
                    MODEL_EFI_INCLUDE               :   _IncludeParser, # for Edk.x modules
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 653..673

            This issue has a mass of 67.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _NmakeParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
                    self._ValueList[0:len(TokenList)] = TokenList
                    # remove macros
                    self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 615..619

            This issue has a mass of 66.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        elif Line[0] == '}' and self._InSubsection:
                            self._InSubsection = False
                            self._SubsectionType = MODEL_UNKNOWN
                            self._SubsectionName = ''
                            self._Owner[-1] = -1
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 803..815

            This issue has a mass of 65.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                    for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
                        Name = TokenSpaceGuid + '.' + PcdName
                        ValList, Valid, Index = AnalyzeDscPcd(Value, MODEL_PCD_FIXED_AT_BUILD)
                        self._Symbols[Name] = ValList[Index]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1322..1325

            This issue has a mass of 64.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                    for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
                        Name = TokenSpaceGuid + '.' + PcdName
                        ValList, Valid, Index = AnalyzeDscPcd(Value, MODEL_PCD_FEATURE_FLAG)
                        self._Symbols[Name] = ValList[Index]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1328..1331

            This issue has a mass of 64.
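            Since this loop and the one reported just above differ only in the PCD model constant passed to AnalyzeDscPcd, a parameterized helper could serve both call sites. The sketch below is illustrative only; collect_pcd_symbols is a hypothetical name, and analyze_pcd stands in for the real AnalyzeDscPcd function:

                def collect_pcd_symbols(records, pcd_model, analyze_pcd, symbols):
                    """Resolve each record's value for the given PCD model and store it by name."""
                    for token_space_guid, pcd_name, value, _dummy2, _dummy3, _id, _line in records:
                        name = token_space_guid + '.' + pcd_name
                        val_list, _valid, index = analyze_pcd(value, pcd_model)
                        symbols[name] = val_list[index]

                # The two duplicated loops would then collapse to single calls, e.g.:
                #   collect_pcd_symbols(Records, MODEL_PCD_FIXED_AT_BUILD, AnalyzeDscPcd, self._Symbols)
                #   collect_pcd_symbols(Records, MODEL_PCD_FEATURE_FLAG, AnalyzeDscPcd, self._Symbols)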

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def __new__(Class, FilePath, *args, **kwargs):
                    if FilePath in Class.MetaFiles:
                        return Class.MetaFiles[FilePath]
                    else:
                        ParserObject = super(MetaFileParser, Class).__new__(Class)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 127..133

            This issue has a mass of 63.
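            One way to share the per-path instance cache that both __new__ implementations duplicate is a small mixin. CachedByPath and DemoParser below are hypothetical names used purely to illustrate the pattern, not code from the source:

                class CachedByPath(object):
                    MetaFiles = {}

                    def __new__(cls, file_path, *args, **kwargs):
                        # Reuse the parser object already built for this path, if any.
                        if file_path in cls.MetaFiles:
                            return cls.MetaFiles[file_path]
                        obj = super(CachedByPath, cls).__new__(cls)
                        cls.MetaFiles[file_path] = obj
                        return obj

                class DemoParser(CachedByPath):
                    def __init__(self, file_path):
                        self.path = file_path

                assert DemoParser('Platform.dsc') is DemoParser('Platform.dsc')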

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if ValueList[0] in ['True', 'true', 'TRUE']:
                        ValueList[0] = '1'
                    elif ValueList[0] in ['False', 'false', 'FALSE']:
                        ValueList[0] = '0'
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 3 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1734..1737

            This issue has a mass of 63.
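            The TRUE/FALSE normalization shown above is small enough to live in one shared helper. normalize_bool_literal is a hypothetical name, and the sketch keeps exactly the spellings the original accepts:

                def normalize_bool_literal(value):
                    """Map the textual boolean spellings onto the '1'/'0' form used elsewhere."""
                    if value in ('True', 'true', 'TRUE'):
                        return '1'
                    if value in ('False', 'false', 'FALSE'):
                        return '0'
                    return value

                # ValueList[0] = normalize_bool_literal(ValueList[0]) at both call sites.
                assert normalize_bool_literal('TRUE') == '1'
                assert normalize_bool_literal('0x10') == '0x10'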

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                    if not ValueRe.match(self._ValueList[1]):
                        EdkLogger.error('Parser', FORMAT_INVALID, "The format of the PCD CName is invalid. The correct format is '(a-zA-Z_)[a-zA-Z0-9_]*'",
                                        ExtraData=self._CurrentLine + \
                                                  " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1773..1777

            This issue has a mass of 61.

            Similar blocks of code found in 2 locations. Consider refactoring.
            Open

                    if not ValueRe.match(self._ValueList[0]):
                        EdkLogger.error('Parser', FORMAT_INVALID, "The format of the token space GUID CName is invalid. The correct format is '(a-zA-Z_)[a-zA-Z0-9_]*'",
                                        ExtraData=self._CurrentLine + \
                                                  " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1779..1783

            This issue has a mass of 61.
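            This check and the PCD CName check above apply the same C-name pattern and differ only in which value they inspect and how the message names it, so a shared validator is a natural refactoring. The sketch is illustrative: check_cname and report_error are hypothetical names, and the regular expression simply mirrors the pattern quoted in the error messages.

                import re

                _CNAME_RE = re.compile(r'^[a-zA-Z_][a-zA-Z0-9_]*$')

                def check_cname(value, what, report_error):
                    """Complain via report_error unless value is a well-formed C name."""
                    if not _CNAME_RE.match(value):
                        report_error("The format of the %s is invalid. "
                                     "The correct format is '(a-zA-Z_)[a-zA-Z0-9_]*'" % what)

                def _raise(message):
                    raise ValueError(message)

                check_cname('PcdDebugPropertyMask', 'PCD CName', _raise)    # passes
                # check_cname('1BadName', 'token space GUID CName', _raise) # would raise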

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                def __ProcessSectionHeader(self):
                    self._SectionName = self._ValueList[0]
                    if self._SectionName in self.DataType:
                        self._SectionType = self.DataType[self._SectionName]
                    else:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1175..1180
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1182..1187
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1313..1318

            This issue has a mass of 60.

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                def __ProcessSubsectionHeader(self):
                    self._SubsectionName = self._ValueList[0]
                    if self._SubsectionName in self.DataType:
                        self._SubsectionType = self.DataType[self._SubsectionName]
                    else:
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1175..1180
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1182..1187
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1306..1311

            This issue has a mass of 60.
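            Both header processors above boil down to the same dictionary lookup with a fallback, so a tiny shared helper would remove the duplication. This is only a sketch: resolve_section_type is a hypothetical name, and the MODEL_UNKNOWN fallback is an assumption, since the report truncates the else branches.

                MODEL_UNKNOWN = 0  # stand-in for the real constant defined by the build tools

                def resolve_section_type(name, data_type, default=MODEL_UNKNOWN):
                    """Map a section or subsection name onto its model type, with a fallback."""
                    return data_type.get(name, default)

                # self._SectionType    = resolve_section_type(self._SectionName, self.DataType)
                # self._SubsectionType = resolve_section_type(self._SubsectionName, self.DataType)
                demo = {'LibraryClasses': 1, 'Pcds': 2}
                assert resolve_section_type('Pcds', demo) == 2
                assert resolve_section_type('NoSuchSection', demo) == MODEL_UNKNOWN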

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if 'COMMON' in ArchList and len(ArchList) > 1:
                        EdkLogger.error('Parser', FORMAT_INVALID, "'common' ARCH must not be used with specific ARCHs",
                                        File=self.MetaFile, Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 303..305
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1617..1619
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1725..1727

            This issue has a mass of 58.

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if 'COMMON' in ArchList and len(ArchList) > 1:
                        EdkLogger.error('Parser', FORMAT_INVALID, "'common' ARCH must not be used with specific ARCHs",
                                        File=self.MetaFile, Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 303..305
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1617..1619
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 317..319

            This issue has a mass of 58.
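            A minimal sketch of one way to collapse the four copies of this ARCH check into a shared helper; check_common_arch and report_error are hypothetical names, and the callback stands in for the EdkLogger.error call the real parsers make:

                def check_common_arch(arch_list, report_error):
                    """Reject scope lists that mix 'COMMON' with specific ARCHs."""
                    if 'COMMON' in arch_list and len(arch_list) > 1:
                        report_error("'common' ARCH must not be used with specific ARCHs")

                def _raise(message):
                    raise ValueError(message)

                check_common_arch(['COMMON'], _raise)            # a lone COMMON scope is fine
                # check_common_arch(['COMMON', 'IA32'], _raise)  # would raise ValueError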

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _Skip(self):
                    EdkLogger.warn("Parser", "Unrecognized content", File=self.MetaFile,
                                    Line=self._LineIndex + 1, ExtraData=self._CurrentLine);
                    self._ValueList[0:1] = [self._CurrentLine]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 259..262

            This issue has a mass of 57.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if len(TokenList) == 2 and type(self) != DscParser: # value
                        self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 346..347

            This issue has a mass of 55.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    elif self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF:
                        self._DirectiveStack.append(self._ItemType)
                        self._DirectiveEvalStack[-1] = not self._DirectiveEvalStack[-1]
                        self._DirectiveEvalStack.append(bool(Result))
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1276..1279

            This issue has a mass of 54.

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                    if len(ValueList) != 2:
                        EdkLogger.error('Parser', FORMAT_INVALID, "Illegal token space GUID and PCD name format",
                                        ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 628..631
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1705..1709
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1805..1809

            This issue has a mass of 53.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                _SectionParser = {
                    MODEL_META_DATA_HEADER                          :   _DefineParser,
                    MODEL_EFI_SKU_ID                                :   _SkuIdParser,
                    MODEL_EFI_LIBRARY_INSTANCE                      :   _LibraryInstanceParser,
                    MODEL_EFI_LIBRARY_CLASS                         :   _LibraryClassParser,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1405..1425

            This issue has a mass of 53.

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                    if len(ValueList) != 3:
                        EdkLogger.error('Parser', FORMAT_INVALID, "Invalid PCD Datum information given",
                                        ExtraData=self._CurrentLine + \
                                                  " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 628..631
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1705..1709
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 707..710

            This issue has a mass of 53.
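            The similar length checks in this group vary only in the expected field count, the message, and the format hint, so they could be expressed through one helper. require_fields and report_error below are hypothetical names standing in for the EdkLogger.error call:

                def require_fields(value_list, expected, message, report_error):
                    """Complain via report_error when value_list does not hold `expected` entries."""
                    if len(value_list) != expected:
                        report_error(message)

                def _raise(message):
                    raise ValueError(message)

                require_fields(['gTokenSpaceGuid', 'PcdName'], 2,
                               "Illegal token space GUID and PCD name format", _raise)    # passes
                # require_fields(['gTokenSpaceGuid'], 2,
                #                "Illegal token space GUID and PCD name format", _raise)  # would raise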

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if Name in GlobalData.gGlobalDefines:
                        EdkLogger.error('Parser', FORMAT_INVALID, "%s can only be defined via environment variable" % Name,
                                        ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 56..58

            This issue has a mass of 51.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if [S1, S2, self.DataType[self._SectionName]] not in self._Scope:
                            self._Scope.append([S1, S2, self.DataType[self._SectionName]])
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 2 hrs to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1613..1614

            This issue has a mass of 51.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        except MacroException, Excpt:
                            EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
                                            File=self._FileWithError, ExtraData=' '.join(self._ValueList),
                                            Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1137..1140

            This issue has a mass of 49.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                            if hasattr(Excpt, 'Pcd'):
                                if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
                                    Info = GlobalData.gPlatformOtherPcds[Excpt.Pcd]
                                    EdkLogger.error('Parser', FORMAT_INVALID, "Cannot use this PCD (%s) in an expression as"
                                                    " it must be defined in a [PcdsFixedAtBuild] or [PcdsFeatureFlag] section"
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1129..1136

            This issue has a mass of 49.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if not TokenList[0]:
                        EdkLogger.error('Parser', FORMAT_INVALID, "No macro name given",
                                        ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 47..49

            This issue has a mass of 47.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if not gMacroNamePattern.match(Name):
                        EdkLogger.error('Parser', FORMAT_INVALID, "The macro name must be in the pattern [A-Z][A-Z0-9_]*",
                                        ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 60..62

            This issue has a mass of 47.
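            The empty-name check and this pattern check recur in both the Workspace and Ecc parsers and could be folded into a single validator. This is a sketch only; validate_macro_name is a hypothetical name, and the regular expression simply mirrors the [A-Z][A-Z0-9_]* rule quoted in the error message:

                import re

                _MACRO_NAME_RE = re.compile(r'^[A-Z][A-Z0-9_]*$')

                def validate_macro_name(name, report_error):
                    """Reject missing or ill-formed macro names via report_error."""
                    if not name:
                        report_error("No macro name given")
                    elif not _MACRO_NAME_RE.match(name):
                        report_error("The macro name must be in the pattern [A-Z][A-Z0-9_]*")

                def _raise(message):
                    raise ValueError(message)

                validate_macro_name('PLATFORM_NAME', _raise)    # passes
                # validate_macro_name('platform_name', _raise)  # would raise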

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if self._DirectiveStack:
                        Type, Line, Text = self._DirectiveStack[-1]
                        EdkLogger.error('Parser', FORMAT_INVALID, "No matching '!endif' found",
                                        ExtraData=Text, File=self.MetaFile, Line=Line)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 849..852

            This issue has a mass of 46.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if DirectiveName not in self.DataType:
                        EdkLogger.error("Parser", FORMAT_INVALID, "Unknown directive [%s]" % DirectiveName,
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 874..876

            This issue has a mass of 46.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                _SectionParser = {
                    MODEL_META_DATA_HEADER          :   MetaFileParser._DefineParser,
                    MODEL_EFI_INCLUDE               :   MetaFileParser._PathParser,
                    MODEL_EFI_LIBRARY_CLASS         :   MetaFileParser._PathParser,
                    MODEL_EFI_GUID                  :   _GuidParser,
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1741..1754

            This issue has a mass of 43.

            Identical blocks of code found in 5 locations. Consider refactoring.
            Open

                    try:
                        Content = open(str(self.MetaFile), 'r').readlines()
                    except:
                        EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 4 other locations - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 435..438
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1476..1479
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 494..497
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1601..1604

            This issue has a mass of 41.
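
            This read-and-report block recurs in five Start() implementations across the two parser modules. Hoisting it into one helper on the shared parser class would remove four copies; a minimal sketch, assuming the EdkLogger module and FILE_READ_FAILURE code these files already use (the helper name _ReadLines is illustrative):

                import Common.EdkLogger as EdkLogger                   # assumed: same logger the parsers import
                from Common.BuildToolError import FILE_READ_FAILURE    # assumed: error code used in the duplicated block

                def _ReadLines(self):
                    # Read the meta file once and report a uniform parser error on
                    # failure, instead of repeating this try/except in every Start().
                    try:
                        with open(str(self.MetaFile), 'r') as MetaFileHandle:
                            return MetaFileHandle.readlines()
                    except EnvironmentError:
                        EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)

            Each Start() would then begin with Content = self._ReadLines(). Note that narrowing the bare except to EnvironmentError is a deliberate tightening, not a behaviour-preserving copy.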

            Identical blocks of code found in 5 locations. Consider refactoring.
            Open

                    try:
                        Content = open(str(self.MetaFile), 'r').readlines()
                    except:
                        EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 4 other locations - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 435..438
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1476..1479
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 860..863
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1601..1604

            This issue has a mass of 41.

            Identical blocks of code found in 5 locations. Consider refactoring.
            Open

                    try:
                        Content = open(str(self.MetaFile), 'r').readlines()
                    except:
                        EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 4 other locations - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 435..438
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1476..1479
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 494..497
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 860..863

            This issue has a mass of 41.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if not IsValid:
                        EdkLogger.error('Parser', FORMAT_INVALID, Cause, ExtraData=self._CurrentLine,
                                        File=self.MetaFile, Line=self._LineIndex + 1)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1730..1732

            This issue has a mass of 41.
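
            A small wrapper would keep this guard (and its twin in the Ecc copy) in one place so every section parser reports syntax failures identically; a minimal sketch, with _CheckSyntax as an illustrative name and FORMAT_INVALID taken from the error constants the module already imports:

                def _CheckSyntax(self, IsValid, Cause):
                    # Abort with a uniform FORMAT_INVALID report when a parsed line
                    # fails validation.
                    if not IsValid:
                        EdkLogger.error('Parser', FORMAT_INVALID, Cause, ExtraData=self._CurrentLine,
                                        File=self.MetaFile, Line=self._LineIndex + 1)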

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if self._ValueList == None or self._ItemType == MODEL_META_DATA_DEFINE:
                            self._ItemType = -1
                            self._Comments = []
                            continue
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1515..1518

            This issue has a mass of 40.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                @ParseMacro
                def _CommonParser(self):
                    TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
                    self._ValueList[0:len(TokenList)] = TokenList
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 240..243

            This issue has a mass of 39.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _GetMacros(self):
                    Macros = {}
                    Macros.update(self._FileLocalMacros)
                    Macros.update(self._GetApplicableSectionMacro())
                    return Macros
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 359..363

            This issue has a mass of 38.
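
            _GetMacros (like the _Done and _CommonParser entries elsewhere in this report) is identical in the Ecc variant, so the duplication disappears if such small methods move to a base class that both variants inherit. A minimal sketch under that assumption; the class name is illustrative:

                class MetaFileParserBase(object):
                    # Shared behaviour for the Workspace and Ecc parser variants, so
                    # identical method bodies live in exactly one module.
                    def _GetMacros(self):
                        Macros = {}
                        Macros.update(self._FileLocalMacros)
                        Macros.update(self._GetApplicableSectionMacro())
                        return Macros

                    def _Done(self):
                        self._Finished = True
                        # Do not set the end flag when processing included files.
                        if self._From == -1:
                            self._Table.SetEndFlag()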

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                        if self._InSubsection and self._Owner[-1] == -1:
                            self._Owner.append(self._LastItem)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 796..797

            This issue has a mass of 38.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def __ProcessLibraryClass(self):
                    self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros, RaiseError=True)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1367..1368

            This issue has a mass of 38.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if (self._ItemType == MODEL_META_DATA_HEADER) and (self._SectionType == MODEL_META_DATA_HEADER):
                        self._FileLocalMacros[Name] = Value
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 1 hr to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1237..1238

            This issue has a mass of 38.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                            if type(self) == DecParser:
                                if MODEL_META_DATA_HEADER in self._SectionType:
                                    self._FileLocalMacros[Name] = Value
                                else:
                                    self._ConstructSectionMacroDict(Name, Value)
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 50 mins to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1353..1357

            This issue has a mass of 36.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if self._ItemType == MODEL_META_DATA_DEFINE:
                        if self._SectionType == MODEL_META_DATA_HEADER:
                            self._FileLocalMacros[Name] = Value
                        else:
                            self._ConstructSectionMacroDict(Name, Value)
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 50 mins to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 75..83

            This issue has a mass of 36.
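
            This entry and the DecParser variant above differ only in how the header section is detected (DecParser keeps a list of section types, so it uses a membership test). One helper can absorb both shapes; a minimal sketch, with _StoreDefine as an illustrative name:

                def _StoreDefine(self, Name, Value):
                    # DEFINEs in the file header become file-local macros; DEFINEs
                    # inside a section are recorded against the current section scope.
                    SectionTypes = self._SectionType if isinstance(self._SectionType, list) else [self._SectionType]
                    if MODEL_META_DATA_HEADER in SectionTypes:
                        self._FileLocalMacros[Name] = Value
                    else:
                        self._ConstructSectionMacroDict(Name, Value)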

            Similar blocks of code found in 7 locations. Consider refactoring.
            Open

                        if len(ItemList) > 1:
                            S1 = ItemList[1].upper()
                        else:
                            S1 = 'COMMON'
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 6 other locations - About 45 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 290..293
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 296..299
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1603..1606
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1609..1612
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 300..303
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1717..1720

            This issue has a mass of 35.
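
            The fall-back-to-'COMMON' pattern for the section scope fields appears in seven places (this entry and the listed locations). A module-level helper keeps the default and the upper-casing in one spot; GetScopeField is an illustrative name:

                def GetScopeField(ItemList, Index, Default='COMMON'):
                    # Return the requested [section] scope field upper-cased, or the
                    # shared default when the header does not provide that field.
                    if len(ItemList) > Index:
                        return ItemList[Index].upper()
                    return Default

            Call sites then reduce to S1 = GetScopeField(ItemList, 1) and S2 = GetScopeField(ItemList, 2).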

            Similar blocks of code found in 7 locations. Consider refactoring.
            Open

                        if len(ItemList) > 2:
                            S2 = ItemList[2].upper()
                        else:
                            S2 = 'COMMON'
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 6 other locations - About 45 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 290..293
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 296..299
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1603..1606
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1609..1612
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 300..303
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1711..1714

            This issue has a mass of 35.

            Similar blocks of code found in 7 locations. Consider refactoring.
            Open

                        if len(ItemList) > 1:
                            S1 = ItemList[1].upper()
                        else:
                            S1 = 'COMMON'
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 6 other locations - About 45 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 290..293
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 296..299
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1603..1606
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1609..1612
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1711..1714
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1717..1720

            This issue has a mass of 35.

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                def __ProcessSkuId(self):
                    self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=True)
                                       for Value in self._ValueList]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 40 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1360..1362
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1401..1403
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1527..1529

            This issue has a mass of 34.
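
            __ProcessSkuId and __ProcessBuildOption differ only in the RaiseError flag, while __ProcessComponent and __ProcessSourceOverridePath expand just the first field. Two thin helpers would cover all four; a minimal sketch, assuming the ReplaceMacro utility the module already imports (helper names are illustrative):

                def _ExpandValueList(self, RaiseError=True):
                    # Expand macros in every field of the current value list.
                    self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=RaiseError)
                                       for Value in self._ValueList]

                def _ExpandValueListField(self, Index=0):
                    # Expand macros in a single field, for the handlers that only
                    # touch the path element.
                    self._ValueList[Index] = ReplaceMacro(self._ValueList[Index], self._Macros)

            __ProcessSkuId then calls self._ExpandValueList(), __ProcessBuildOption calls self._ExpandValueList(RaiseError=False), and the component/override handlers call self._ExpandValueListField(0).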

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                def __ProcessBuildOption(self):
                    self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=False)
                                       for Value in self._ValueList]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 40 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1360..1362
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1401..1403
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1487..1489

            This issue has a mass of 34.

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                def __ProcessComponent(self):
                    self._ValueList[0] = ReplaceMacro(self._ValueList[0], self._Macros)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 40 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1395..1396
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1398..1399
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1524..1525

            This issue has a mass of 34.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                def _Done(self):
                    self._Finished = True
                    ## Do not set end flag when processing included files
                    if self._From == -1:
                        self._Table.SetEndFlag()
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 40 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 190..194

            This issue has a mass of 34.

            Similar blocks of code found in 4 locations. Consider refactoring.
            Open

                def __ProcessSourceOverridePath(self):
                    self._ValueList[0] = ReplaceMacro(self._ValueList[0], self._Macros)
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 40 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1395..1396
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1398..1399
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1521..1522

            This issue has a mass of 34.

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if len(TokenList) == 2:
                        self._ValueList[2] = TokenList[1]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 35 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 984..985
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1042..1043
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1151..1152

            This issue has a mass of 33.
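
            Only the length test varies between the four copies of this optional-value assignment (== 2 here, > 1 elsewhere). A tiny helper keeps the slot index in one place; the names below are illustrative:

                def _SetOptionalValue(self, TokenList, SourceIndex=1, TargetIndex=2):
                    # Copy the optional value token into the value list when present.
                    if len(TokenList) > SourceIndex:
                        self._ValueList[TargetIndex] = TokenList[SourceIndex]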

            Identical blocks of code found in 4 locations. Consider refactoring.
            Open

                    if len(TokenList) == 2:                 # value
                        self._ValueList[2] = TokenList[1]
            Severity: Major
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 3 other locations - About 35 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 984..985
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 1042..1043
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1078..1079

            This issue has a mass of 33.

            Identical blocks of code found in 2 locations. Consider refactoring.
            Open

                    if len(TokenList) > 1:
                        self._ValueList[2] = TokenList[1]
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 1 other location - About 35 mins to fix
            BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py on lines 633..634

            This issue has a mass of 33.

            Identical blocks of code found in 3 locations. Consider refactoring.
            Open

                            if self._SectionName not in SECTIONS_HAVE_ITEM_AFTER_ARCH and len(ItemList) > 3:
                                EdkLogger.error("Parser", FORMAT_UNKNOWN_ERROR, "%s is not a valid section name" % Item,
                                                self.MetaFile, self._LineIndex + 1, self._CurrentLine)
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 2 other locations - About 30 mins to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 293..295
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1694..1699

            This issue has a mass of 32.
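
            The "not a valid section name" report is emitted from three spots in the section-header parsing. Funnelling them through one method keeps the wording and the file/line context consistent; a minimal sketch, with _ReportInvalidSection as an illustrative name and FORMAT_UNKNOWN_ERROR taken from the constants already imported here:

                def _ReportInvalidSection(self, Item):
                    # Single place to reject an unrecognised [section] header.
                    EdkLogger.error("Parser", FORMAT_UNKNOWN_ERROR,
                                    "%s is not a valid section name" % Item,
                                    self.MetaFile, self._LineIndex + 1, self._CurrentLine)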

            Identical blocks of code found in 3 locations. Consider refactoring.
            Open

                        elif self._Version >= 0x00010005:
                            EdkLogger.error("Parser", FORMAT_UNKNOWN_ERROR, "%s is not a valid section name" % Item,
                                            self.MetaFile, self._LineIndex + 1, self._CurrentLine)
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 2 other locations - About 30 mins to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 290..292
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 1694..1699

            This issue has a mass of 32.

            Identical blocks of code found in 3 locations. Consider refactoring.
            Open

                        if self._SectionName in self.DataType:
                            if self.DataType[self._SectionName] not in self._SectionType:
                                self._SectionType.append(self.DataType[self._SectionName])
                        else:
                            EdkLogger.error("Parser", FORMAT_UNKNOWN_ERROR, "%s is not a valid section name" % Item,
            Severity: Minor
            Found in BaseTools/Source/Python/Workspace/MetaFileParser.py and 2 other locations - About 30 mins to fix
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 290..292
            BaseTools/Source/Python/Workspace/MetaFileParser.py on lines 293..295

            This issue has a mass of 32.
