WenjieDu/TSDB

tsdb/loading_funcs/ucr_uea_datasets.py

Summary

Maintainability: C (estimated time to fix: 1 day)
Test Coverage: not available

Function _load_arff_uea has a Cognitive Complexity of 32 (exceeds 5 allowed). Consider refactoring.
Open

def _load_arff_uea(
    full_file_path_and_name,
    replace_missing_vals_with="NaN",
):
    """Load data from a classification/regression WEKA arff file to a 3D np array.
Severity: Minor
Found in tsdb/loading_funcs/ucr_uea_datasets.py - About 4 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

Further reading
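
As a minimal illustration of these rules, the two functions below do the same thing. The nested version accrues an extra increment for each break in linear flow and for each level of nesting, while the second version uses a generator expression, which counts as language shorthand and keeps the flow linear. The increment annotations are illustrative, not the output of any specific tool.

```python
def first_negative_nested(rows):
    # Nested flow breaks: each nested structure costs more than the last.
    result = None
    for row in rows:                # +1 (loop)
        for value in row:           # +2 (loop, nested once)
            if value < 0:           # +3 (branch, nested twice)
                if result is None:  # +4 (branch, nested three deep)
                    result = value
    return result


def first_negative_flat(rows):
    # Same behavior; the generator expression is shorthand the language
    # provides for collapsing the loops, so it adds no nesting penalty.
    return next((v for row in rows for v in row if v < 0), None)
```

Both return the first negative value encountered, or `None` if there is none; only the second reads linearly.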

File ucr_uea_datasets.py has 291 lines of code (exceeds 250 allowed). Consider refactoring.
Open

"""
Scripts related to UCR & UEA datasets http://timeseriesclassification.com/index.php

Most of the code comes from the tslearn library: https://github.com/tslearn-team/tslearn.

Severity: Minor
Found in tsdb/loading_funcs/ucr_uea_datasets.py - About 3 hrs to fix

    Avoid deeply nested control flow statements.
    Open

                            for c in range(len(channels)):
                                split = channels[c].split(",")
                                inst[c] = np.array([float(i) for i in split])
                        else:
    Severity: Major
    Found in tsdb/loading_funcs/ucr_uea_datasets.py - About 45 mins to fix

      Function to_time_series_dataset has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.
      Open

      def to_time_series_dataset(dataset, dtype=float):
          """Transforms a time series dataset so that it fits the format used in
          ``tslearn`` models.
      
          Parameters
      Severity: Minor
      Found in tsdb/loading_funcs/ucr_uea_datasets.py - About 45 mins to fix
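
The core of such a conversion is padding variable-length series into a single 3D array of shape (n_ts, max_length, dimensionality), with NaN marking the padding. Below is a simplified sketch of that behavior under those assumptions; tslearn's real implementation handles more input types and edge cases.

```python
import numpy as np


def to_time_series_dataset_sketch(dataset, dtype=float):
    # Pad a list of variable-length series into one 3D array
    # of shape (n_ts, max_length, dimensionality), NaN-padded.
    n = len(dataset)
    max_len = max(len(ts) for ts in dataset)
    d = np.asarray(dataset[0]).reshape(len(dataset[0]), -1).shape[1]
    out = np.full((n, max_len, d), np.nan, dtype=dtype)
    for i, ts in enumerate(dataset):
        ts = np.asarray(ts, dtype=dtype).reshape(len(ts), -1)
        out[i, : len(ts)] = ts
    return out
```

For example, two univariate series of lengths 3 and 2 become a (2, 3, 1) array whose last entry in the shorter series is NaN.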


      Avoid deeply nested control flow statements.
      Open

                          if is_multi_variate:
                              line, class_val = line.split("',")
                              class_val_list.append(class_val.strip())
                              channels = line.split("\\n")
                              channels[0] = channels[0].replace("'", "")
      Severity: Major
      Found in tsdb/loading_funcs/ucr_uea_datasets.py - About 45 mins to fix
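
To make the flagged snippet concrete, here is how one multivariate ARFF data row is pulled apart. The sample values and label are made up; the file separates channels with a literal backslash followed by "n", which is why the code splits on the two-character string "\\n" rather than on a real newline.

```python
# A made-up multivariate row: two channels of two values each,
# wrapped in single quotes, followed by the class label.
raw = "'1.0,2.0\\n3.0,4.0',walking"

line, class_val = raw.split("',")           # separate values from the label
channels = line.split("\\n")                # split on the literal backslash-n
channels[0] = channels[0].replace("'", "")  # strip the leading quote

# channels is now ["1.0,2.0", "3.0,4.0"]; class_val.strip() is "walking"
```

Each entry of `channels` is then a plain comma-separated string, ready for the per-channel float parsing shown in the earlier snippet.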

        Avoid deeply nested control flow statements.
        Open

                                if is_first_case:
                                    is_first_case = False
                                    n_timepoints = len(line_parts) - 1
                                class_val_list.append(line_parts[-1].strip())
        Severity: Major
        Found in tsdb/loading_funcs/ucr_uea_datasets.py - About 45 mins to fix
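
For comparison, the univariate branch above parses plain comma-separated rows, where the last field is the class label and the first row fixes the expected number of timepoints. A made-up row:

```python
line = "0.1,0.2,0.3,standing"  # made-up univariate row: 3 values + label

line_parts = line.split(",")
n_timepoints = len(line_parts) - 1            # first case sets this to 3
class_val = line_parts[-1].strip()            # "standing"
values = [float(v) for v in line_parts[:-1]]  # the series itself
```

Subsequent rows are expected to match `n_timepoints`, which is why the flagged code only computes it when `is_first_case` is true.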
