Showing 96 of 96 total issues
File builders.py
has 583 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
Defines a builder pattern for ``TypedDf``.
File file_formats.py
has 569 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
File formats for reading/writing to/from DFs.
File checksum_models.py
has 454 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
Models for shasum-like files.
_RetypeMixin
has 45 functions (exceeds 20 allowed). Consider refactoring. Open
class _RetypeMixin:
def __add__(self, other):
x = super().__add__(other)
return self._change_if_df(x)
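The truncated preview above shows the pattern behind `_RetypeMixin`: each operator delegates to `super()` and re-wraps the result, because pandas operations on a plain subclass return an untyped `DataFrame`. A minimal sketch of that re-typing idea, with hypothetical names (`_RetypeSketch`, `TypedSketch`) standing in for the library's classes:

```python
import pandas as pd

class _RetypeSketch:
    """Hypothetical re-typing mixin: pandas operators return plain
    DataFrames, so each dunder re-applies the subclass afterwards."""

    def _change_if_df(self, x):
        # Re-type only when the result is actually a DataFrame.
        if isinstance(x, pd.DataFrame):
            x.__class__ = self.__class__
        return x

    def __add__(self, other):
        # Delegate the arithmetic to pandas, then restore the subclass.
        return self._change_if_df(super().__add__(other))

class TypedSketch(_RetypeSketch, pd.DataFrame):
    pass

df = TypedSketch({"a": [1, 2]})
total = df + 1
print(type(total).__name__)  # the result keeps the subclass type
```

Repeating this wrapper for every arithmetic and comparison dunder is what pushes the real mixin to 45 functions; the count is inherent to the pattern rather than a sign of tangled logic.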
File df_typing.py
has 378 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
Information about how DataFrame subclasses should be handled.
File _ini_like_mixin.py
has 351 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
Mixin for INI, .properties, and TOML.
Function write_file
has a Cognitive Complexity of 26 (exceeds 5 allowed). Consider refactoring. Open
def write_file(
self,
path: Path | str,
*,
overwrite: bool = True,
Cognitive Complexity
Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.
A method's cognitive complexity is based on a few simple rules:
- Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
- Code is considered more complex for each "break in the linear flow of the code"
- Code is considered more complex when "flow breaking structures are nested"
Further reading
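The nesting rule above is the one that dominates most of the flagged functions. A small illustrative pair (hypothetical functions, with rough per-construct scores in comments) shows how replacing nested `if`s with guard clauses lowers the score without changing behavior:

```python
def categorize_nested(values):
    """Nested flow breaks: each `if` inside the loop costs more."""
    result = []
    for v in values:              # +1 (flow break)
        if v is not None:         # +2 (nested one level deep)
            if v >= 0:            # +3 (nested two levels deep)
                result.append("positive" if v else "zero")
    return result

def categorize_flat(values):
    """Same behavior, flattened with a guard clause."""
    result = []
    for v in values:              # +1 (flow break)
        if v is None or v < 0:    # +2 nested, +1 for the boolean sequence
            continue
        result.append("positive" if v else "zero")
    return result
```

The conditional expression (`"positive" if v else "zero"`) is language shorthand and, per the first rule, does not add to the score; the savings come entirely from removing a nesting level.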
Function describe_dtype
has a Cognitive Complexity of 25 (exceeds 5 allowed). Consider refactoring. Open
def describe_dtype(cls, t: type[Any], *, short: bool = False) -> str | None:
"""
Returns a string name for a Pandas-supported dtype.
Args:
File df_errors.py
has 307 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
Exceptions used by typeddfs.
Function write_any
has a Cognitive Complexity of 20 (exceeds 5 allowed). Consider refactoring. Open
def write_any(
self,
path: PathLike,
*,
to_file: bool,
FileFormat
has 24 functions (exceeds 20 allowed). Consider refactoring. Open
class FileFormat(_Enum):
"""
A computer-readable format for reading **and** writing of DataFrames in typeddfs.
This includes CSV, Parquet, ODT, etc. Some formats also include compressed variants.
E.g., a ".csv.gz" will map to ``FileFormat.csv``.
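The docstring describes mapping a compressed suffix back to its base format. A minimal sketch of that resolution logic, assuming a hypothetical `SketchFormat` enum and compression set in place of the library's real `FileFormat` machinery:

```python
from enum import Enum
from pathlib import PurePath

class SketchFormat(Enum):
    """Hypothetical stand-in for FileFormat's suffix resolution."""
    csv = "csv"
    tsv = "tsv"
    parquet = "parquet"

# Assumed compression suffixes; the real library's set may differ.
COMPRESSIONS = {".gz", ".zip", ".bz2", ".xz", ".zst"}

def from_path(path: str) -> SketchFormat:
    # Strip one trailing compression suffix, then match the format suffix.
    suffixes = PurePath(path).suffixes
    if suffixes and suffixes[-1] in COMPRESSIONS:
        suffixes = suffixes[:-1]
    if not suffixes:
        raise ValueError(f"No recognized suffix in {path}")
    return SketchFormat(suffixes[-1].lstrip("."))

print(from_path("table.csv.gz").name)  # csv
```

Handling every format plus its compressed variants this way explains why the enum accumulates more than 20 methods.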
Function convert
has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring. Open
def convert(cls, df: pd.DataFrame) -> __qualname__:
"""
Converts a vanilla Pandas DataFrame (or any subclass) to ``cls``.
Explicitly sets the new copy's __class__ to cls.
Rearranges the columns and index names.
Function _read_properties_like
has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring. Open
def _read_properties_like(
cls,
unescape_keys,
unescape_values,
comment_chars: set[str],
DfTyping
has 22 functions (exceeds 20 allowed). Consider refactoring. Open
class DfTyping:
"""
Contains all information about how to type a DataFrame subclass.
"""
Function to_fwf
has a Cognitive Complexity of 17 (exceeds 5 allowed). Consider refactoring. Open
def to_fwf(
self,
path_or_buff=None,
mode: str = "w",
colspecs: Sequence[tuple[int, int]] | None = None,
File abs_dfs.py
has 258 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
Defines a low-level DataFrame subclass.
IoTyping
has 21 functions (exceeds 20 allowed). Consider refactoring. Open
class IoTyping(Generic[T_co]):
_hash_alg: str | None = "sha256"
_save_hash_file: bool = False
_save_hash_dir: bool = False
_remap_suffixes: Mapping[str, FileFormat] | None = None
File frozen_types.py
has 255 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-License-Identifier: Apache-2.0
# Source: https://github.com/dmyersturnbull/typed-dfs
#
"""
Hashable and ordered collections.
Function of
has a Cognitive Complexity of 16 (exceeds 5 allowed). Consider refactoring. Open
def of(cls, df, *args, keys: Iterable[str] | None = None, **kwargs) -> __qualname__:
"""
Construct or convert a DataFrame, returning this type.
Delegates to :meth:`convert` for DataFrames,
or tries first constructing a DataFrame by calling ``pd.DataFrame(df)``.
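The dispatch the docstring describes, convert DataFrames directly, otherwise build one first, can be sketched as follows. `SketchDf`, its `convert`, and its `of` are hypothetical stand-ins for the library's actual classes and keyword handling:

```python
import pandas as pd

class SketchDf(pd.DataFrame):
    """Hypothetical subclass illustrating the `of` dispatch."""

    @classmethod
    def convert(cls, df: pd.DataFrame) -> "SketchDf":
        # Re-type a copy without re-running the DataFrame constructor.
        df = df.copy()
        df.__class__ = cls
        return df

    @classmethod
    def of(cls, df, *args, **kwargs) -> "SketchDf":
        if isinstance(df, pd.DataFrame):
            return cls.convert(df)
        # Anything else: build a plain DataFrame first, then convert.
        return cls.convert(pd.DataFrame(df, *args, **kwargs))

out = SketchDf.of({"x": [1, 2]})
print(type(out).__name__)  # SketchDf
```

The real `of` also accepts `keys` and extra constructor arguments, which is where much of its branching, and hence its complexity score, comes from.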
File matrix_dfs.py
has 253 lines of code (exceeds 250 allowed). Consider refactoring. Open
# SPDX-FileCopyrightText: Copyright 2020-2023, Contributors to typed-dfs
# SPDX-PackageHomePage: https://github.com/dmyersturnbull/typed-dfs
# SPDX-License-Identifier: Apache-2.0
"""
DataFrames that are essentially n-by-m matrices.