usaspending_api/common/helpers/spark_helpers.py
File `spark_helpers.py` has 473 lines of code (exceeds 250 allowed). Consider refactoring. Wontfix
"""
Boilerplate helper functions for setup and configuration of the Spark environment
NOTE: This is distinguished from the usaspending_api.common.etl.spark module, which holds Spark utility functions that
could be used as stages or steps of an ETL job (aka "data pipeline")
Cyclomatic complexity is too high in function configure_spark_session. (28) Wontfix
def configure_spark_session(
    java_gateway: JavaGateway = None,
    spark_context: Union[SparkContext, SparkSession] = None,
    master=None,
    app_name="Spark App",
Cyclomatic Complexity
Cyclomatic Complexity corresponds to the number of decisions a block of code contains plus 1. This number (also called McCabe number) is equal to the number of linearly independent paths through the code. This number can be used as a guide when testing conditional logic in blocks.
Radon analyzes the AST tree of a Python program to compute Cyclomatic Complexity. Statements have the following effects on Cyclomatic Complexity:
| Construct | Effect on CC | Reasoning |
|---|---|---|
| if | +1 | An if statement is a single decision. |
| elif | +1 | The elif statement adds another decision. |
| else | +0 | The else statement does not cause a new decision. The decision is at the if. |
| for | +1 | There is a decision at the start of the loop. |
| while | +1 | There is a decision at the while statement. |
| except | +1 | Each except branch adds a new conditional path of execution. |
| finally | +0 | The finally block is unconditionally executed. |
| with | +1 | The with statement roughly corresponds to a try/except block (see PEP 343 for details). |
| assert | +1 | The assert statement internally roughly equals a conditional statement. |
| Comprehension | +1 | A list/set/dict comprehension or generator expression is equivalent to a for loop. |
| Boolean Operator | +1 | Every boolean operator (and, or) adds a decision point. |
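As an illustration of the tally above (this example is not from `spark_helpers.py`), here is a small function annotated by hand with the per-construct increments from the table. Starting from a base complexity of 1, the rules give a McCabe number of 6:

```python
def classify(values):
    """Toy example: each annotated construct adds to the McCabe number (base = 1)."""
    result = []
    for v in values:                       # for: +1          -> 2
        if v is None:                      # if: +1           -> 3
            continue
        elif v < 0 and v % 2 == 0:         # elif: +1, and: +1 -> 5
            result.append("negative-even")
        else:                              # else: +0         -> 5
            result.append("other")
    return [r.upper() for r in result]     # comprehension: +1 -> 6
```

Radon applies these same increments mechanically to the AST, which is why a long configuration function full of branches and boolean guards (like `configure_spark_session`, at 28) climbs so quickly.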
Function configure_spark_session has 9 arguments (exceeds 6 allowed). Consider refactoring. Wontfix
def configure_spark_session(
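A common refactoring for the too-many-arguments smell is to bundle related parameters into a configuration object. The sketch below is hypothetical: only four of the nine real parameters appear in the report, and the `JavaGateway`/`SparkContext` types are stubbed as `Any` to keep the example self-contained.

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class SparkSessionConfig:
    """Hypothetical grouping of configure_spark_session's keyword arguments
    (only the four shown in the report are reproduced here)."""

    java_gateway: Optional[Any] = None   # JavaGateway in the real module
    spark_context: Optional[Any] = None  # Union[SparkContext, SparkSession]
    master: Optional[str] = None
    app_name: str = "Spark App"


def configure_spark_session(config: Optional[SparkSessionConfig] = None) -> SparkSessionConfig:
    """Sketch only: the real function would build a SparkSession from these fields."""
    config = config or SparkSessionConfig()
    # ... e.g. SparkSession.builder.master(config.master).appName(config.app_name) ...
    return config
```

This drops the argument count to one while keeping call sites explicit (`configure_spark_session(SparkSessionConfig(master="local[*]"))`), and the dataclass defaults preserve the original keyword-argument defaults.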
Function configure_spark_session has a Cognitive Complexity of 17 (exceeds 15 allowed). Consider refactoring. Wontfix
def configure_spark_session(
    java_gateway: JavaGateway = None,
    spark_context: Union[SparkContext, SparkSession] = None,
    master=None,
    app_name="Spark App",
Cognitive Complexity
Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.
A method's cognitive complexity is based on a few simple rules:
- Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
- Code is considered more complex for each "break in the linear flow of the code"
- Code is considered more complex when "flow breaking structures are nested"
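The nesting rule is what separates this metric from cyclomatic complexity. In the hand-annotated example below (illustrative, not from the report; increments follow the published SonarSource cognitive-complexity model that this check is based on), the nested version scores roughly 6, while the flattened version scores near 0 because comprehensions count as language shorthand under the first rule:

```python
def find_first_match_nested(rows, target):
    """Nested loops and branches: each break in flow also pays a nesting penalty."""
    for row in rows:               # for: +1                      -> 1
        for item in row:           # for: +1, nesting: +1         -> 3
            if item == target:     # if: +1, nesting: +2          -> 6
                return item
    return None


def find_first_match_flat(rows, target):
    """Same behavior, flattened: comprehensions/generators are shorthand,
    so they do not add cognitive complexity."""
    flat = (item for row in rows for item in row)
    return next((item for item in flat if item == target), None)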