tensorflow/models
official/nlp/data/classifier_data_lib.py

Summary

Maintainability: F (estimated 6 days of remediation)
Test Coverage: not available

File classifier_data_lib.py has 1329 lines of code (exceeds 250 allowed). Consider refactoring.

# Copyright 2024 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
Severity: Major
Found in official/nlp/data/classifier_data_lib.py - About 3 days to fix

Function _create_examples has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring.

  def _create_examples(self, split_name, set_type):
    """Creates examples for the training/dev/test sets."""
    if split_name not in self.dataset:
      raise ValueError("Split {} not available.".format(split_name))
    dataset = self.dataset[split_name].as_numpy_iterator()

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 2 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
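As an illustration of these rules, here is a small Python sketch (not taken from classifier_data_lib.py; the increment annotations follow the spirit of the rules rather than any exact scoring):

```python
def label_sign(x):
  # Shorthand (a conditional expression) collapses a branch into one
  # statement and, per the first rule, adds no cognitive complexity.
  return "non-negative" if x >= 0 else "negative"


def count_long_pairs(pairs, max_len):
  count = 0
  for a, b in pairs:          # a break in the linear flow of the code
    if len(a) > max_len:      # a flow break nested inside the loop
      if len(b) > max_len:    # nested one level deeper still
        count += 1
  return count
```

The second function reads as harder than the first even though both are short: each nested flow break costs more than the one enclosing it.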

Function get_train_examples has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring.

  def get_train_examples(self, data_dir):
    """See base class."""
    lines = self._read_tsv(os.path.join(data_dir, "train-en.tsv"))

    examples = []

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 2 hrs to fix

Function convert_single_example has a Cognitive Complexity of 15 (exceeds 5 allowed). Consider refactoring.

def convert_single_example(ex_index, example, label_list, max_seq_length,
                           tokenizer):
  """Converts a single `InputExample` into a single `InputFeatures`."""
  label_map = {}
  if label_list:

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function file_based_convert_examples_to_features has a Cognitive Complexity of 15 (exceeds 5 allowed). Consider refactoring.

def file_based_convert_examples_to_features(examples,
                                            label_list,
                                            max_seq_length,
                                            tokenizer,
                                            output_file,

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function generate_tf_record_from_data_file has a Cognitive Complexity of 14 (exceeds 5 allowed). Consider refactoring.

def generate_tf_record_from_data_file(processor,
                                      data_dir,
                                      tokenizer,
                                      train_data_output_path=None,
                                      eval_data_output_path=None,

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function get_dev_examples has a Cognitive Complexity of 13 (exceeds 5 allowed). Consider refactoring.

  def get_dev_examples(self, data_dir):
    """See base class."""
    examples = []
    if self.only_use_en_dev:
      lines = self._read_tsv(os.path.join(data_dir, "dev-en.tsv"))

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function get_test_examples has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.

  def get_test_examples(self, data_dir):
    """See base class."""
    examples_by_lang = {}
    for lang in self.supported_languages:
      examples_by_lang[lang] = []

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function get_test_examples has a Cognitive Complexity of 12 (exceeds 5 allowed). Consider refactoring.

  def get_test_examples(self, data_dir):
    """See base class."""
    examples_by_lang = {}
    for lang in self.supported_languages:
      examples_by_lang[lang] = []

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function featurize_example has 34 lines of code (exceeds 25 allowed). Consider refactoring.

  def featurize_example(self, ex_index, example, label_list, max_seq_length,
                        tokenizer):
    """Here we concate sentence1, sentence2, word together with [SEP] tokens."""
    del label_list
    tokens_a = tokenizer.tokenize(example.text_a)

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix
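The docstring above describes joining sentence pieces with `[SEP]` tokens. A minimal sketch of that packing pattern (standard BERT-style segment packing, not the verbatim method body from this file):

```python
def pack_segments(tokens_a, tokens_b):
  # BERT-style packing: [CLS] A [SEP] B [SEP], with segment id 0 covering
  # [CLS], the first segment, and its [SEP]; segment id 1 covering the rest.
  tokens = ["[CLS]"] + tokens_a + ["[SEP]"]
  segment_ids = [0] * len(tokens)
  tokens += tokens_b + ["[SEP]"]
  segment_ids += [1] * (len(tokens_b) + 1)
  return tokens, segment_ids
```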

Function convert_single_example has 28 lines of code (exceeds 25 allowed). Consider refactoring.

def convert_single_example(ex_index, example, label_list, max_seq_length,
                           tokenizer):
  """Converts a single `InputExample` into a single `InputFeatures`."""
  label_map = {}
  if label_list:

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 1 hr to fix

Function get_train_examples has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

  def get_train_examples(self, data_dir):
    """See base class."""
    examples = []
    if self.translated_data_dir is None:
      lines = self._read_tsv(os.path.join(data_dir, "train-en.tsv"))

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 55 mins to fix

Function get_dev_examples has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

  def get_dev_examples(self, data_dir):
    """See base class."""
    examples = []
    if self.only_use_en_dev:
      lines = self._read_tsv(os.path.join(data_dir, "dev-en.tsv"))

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 55 mins to fix

Function _create_examples has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

  def _create_examples(self, data_dir):
    """Creates examples."""
    examples = []
    for label in ["neg", "pos"]:
      cur_dir = os.path.join(data_dir, label)

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 55 mins to fix

Function __init__ has 7 arguments (exceeds 4 allowed). Consider refactoring.

  def __init__(self,

Severity: Major
Found in official/nlp/data/classifier_data_lib.py - About 50 mins to fix
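One common way to resolve a too-many-arguments finding is to group related constructor parameters into a configuration object. A hedged sketch of the pattern (the class and field names here are hypothetical, not the actual arguments of this `__init__`):

```python
from dataclasses import dataclass


# Hypothetical fields for illustration only; the real constructor in
# classifier_data_lib.py takes different arguments.
@dataclass
class ProcessorConfig:
  data_dir: str
  max_seq_length: int = 128
  do_lower_case: bool = True


class Processor:

  def __init__(self, config: ProcessorConfig):
    # A single config argument replaces a long positional parameter list.
    self.config = config


proc = Processor(ProcessorConfig(data_dir="/tmp/data"))
```

Defaults live on the dataclass, so call sites only name the values they change.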

Function file_based_convert_examples_to_features has 7 arguments (exceeds 4 allowed). Consider refactoring.

def file_based_convert_examples_to_features(examples,

Severity: Major
Found in official/nlp/data/classifier_data_lib.py - About 50 mins to fix

Function generate_tf_record_from_data_file has 7 arguments (exceeds 4 allowed). Consider refactoring.

def generate_tf_record_from_data_file(processor,

Severity: Major
Found in official/nlp/data/classifier_data_lib.py - About 50 mins to fix

Function __init__ has 7 arguments (exceeds 4 allowed). Consider refactoring.

  def __init__(self,

Severity: Major
Found in official/nlp/data/classifier_data_lib.py - About 50 mins to fix

Function __init__ has 6 arguments (exceeds 4 allowed). Consider refactoring.

  def __init__(self,

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 45 mins to fix

Function convert_single_example has 5 arguments (exceeds 4 allowed). Consider refactoring.

def convert_single_example(ex_index, example, label_list, max_seq_length,

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 35 mins to fix

Function featurize_example has 5 arguments (exceeds 4 allowed). Consider refactoring.

  def featurize_example(self, ex_index, example, label_list, max_seq_length,

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 35 mins to fix

Function _truncate_seq_pair has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

def _truncate_seq_pair(tokens_a, tokens_b, max_length):
  """Truncates a sequence pair in place to the maximum length."""

  # This is a simple heuristic which will always truncate the longer sequence
  # one token at a time. This makes more sense than truncating an equal percent

Severity: Minor
Found in official/nlp/data/classifier_data_lib.py - About 25 mins to fix
