MrPowers/spark-daria

View on GitHub

Showing 12 of 21 total issues

functions has 30 methods (exceeds 20 allowed). Consider refactoring.
Open

object functions {

  //////////////////////////////////////////////////////////////////////////////////////////////
  // String functions
  //////////////////////////////////////////////////////////////////////////////////////////////
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/functions.scala - About 3 hrs to fix
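
One way to get under the limit without breaking callers is to split the method families into traits and mix them back into the object, so the import path stays the same. A sketch; the trait names and the singleSpace body are illustrative, not the library's actual layout:

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{regexp_replace, trim}

// Each trait holds one family of helpers; the public object keeps its name.
trait StringFunctions {
  def singleSpace(col: Column): Column =
    trim(regexp_replace(col, " +", " "))
}

trait DateFunctions {
  // date helpers would move here
}

object functions extends StringFunctions with DateFunctions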

Function trans has 39 lines of code (exceeds 25 allowed). Consider refactoring.
Open

  def trans(customTransform: CustomTransform): DataFrame = {
    // make sure df doesn't already have the columns that will be added
    if (df.columns.toSeq.exists((c: String) => customTransform.addedColumns.contains(c))) {
      throw DataFrameColumnsException(
        s"The DataFrame already contains the columns your transformation will add. The DataFrame has these columns: [${df.columns

Function bucketFinder has 31 lines of code (exceeds 25 allowed). Consider refactoring.
Open

  ): Column = {

    val inclusiveBoundriesCol = lit(inclusiveBoundries)
    val lowerBoundLteCol      = lit(lowestBoundLte)
    val upperBoundGteCol      = lit(highestBoundGte)
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/functions.scala - About 1 hr to fix

Function run has 29 lines of code (exceeds 25 allowed). Consider refactoring.
Open

  def run(): Unit = {

    lazy val spark: SparkSession = {
      SparkSession
        .builder()
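
The excerpt shows the SparkSession being built lazily, so nothing spins up until spark is first touched. A minimal self-contained sketch of that pattern (the master and appName values are placeholders, not what run() actually uses):

import org.apache.spark.sql.SparkSession

object MyApp {
  // Built on first access, so loading the object (e.g. in tests) costs nothing.
  lazy val spark: SparkSession = {
    SparkSession
      .builder()
      .master("local[*]")
      .appName("my-app")
      .getOrCreate()
  }

  def run(): Unit = {
    spark.range(5).show()
  }
}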

Method writeThenMerge has 6 arguments (exceeds 4 allowed). Consider refactoring.
Open

      df: DataFrame,
      format: String = "csv",                // csv, parquet
      sc: SparkContext,                      // pass in spark.sparkContext
      tmpFolder: String,                     // will be deleted, so make sure it doesn't already exist
      filename: String,                      // the full filename you want outputted
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/DariaWriters.scala - About 45 mins to fix
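
One conventional fix is a parameter object. A sketch with a hypothetical WriterOptions case class (not part of spark-daria) that bundles the path and format settings, dropping the visible arity to three:

import org.apache.spark.SparkContext
import org.apache.spark.sql.DataFrame

// Hypothetical parameter object, not part of spark-daria.
case class WriterOptions(
  tmpFolder: String,     // will be deleted, so make sure it doesn't already exist
  filename: String,      // the full filename you want outputted
  format: String = "csv" // csv, parquet
)

def writeThenMerge(df: DataFrame, sc: SparkContext, opts: WriterOptions): Unit = {
  // existing body unchanged, reading its settings from opts
}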

Method writeSingleFile has 6 arguments (exceeds 4 allowed). Consider refactoring.
Open

      df: DataFrame,             // must be small
      format: String = "csv",    // csv, parquet
      sc: SparkContext,          // pass in spark.sparkContext
      tmpFolder: String,         // will be deleted, so make sure it doesn't already exist
      filename: String,          // the full filename you want outputted
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/DariaWriters.scala - About 45 mins to fix
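
At the call site, named arguments keep the six parameters readable. A call sketch based on the signature shown above (any parameter not listed is assumed to have a default):

import org.apache.spark.sql.SparkSession
import com.github.mrpowers.spark.daria.sql.DariaWriters

val spark = SparkSession.builder().master("local").getOrCreate()
val df    = spark.range(10).toDF("id") // must be small: it is written as one file

DariaWriters.writeSingleFile(
  df = df,
  format = "csv",
  sc = spark.sparkContext,
  tmpFolder = "/tmp/daria_tmp",    // deleted afterwards; must not already exist
  filename = "/tmp/out/mydata.csv" // the single output file
)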

Method withColBucket has 6 arguments (exceeds 4 allowed). Consider refactoring.
Open

      colName: String,
      outputColName: String,
      buckets: Array[(Any, Any)],
      inclusiveBoundries: Boolean = false,
      lowestBoundLte: Boolean = false,
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/transformations.scala - About 45 mins to fix
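
A usage sketch, assuming the usual transformations-style curried signature that composes with DataFrame.transform; the column name and bucket values are illustrative:

import org.apache.spark.sql.SparkSession
import com.github.mrpowers.spark.daria.sql.transformations

val spark = SparkSession.builder().master("local").getOrCreate()
import spark.implicits._
val df = Seq(12, 40, 70).toDF("age")

// Buckets are (lower, upper) pairs; the explicit element type satisfies
// the Array[(Any, Any)] parameter.
val withBuckets = df.transform(
  transformations.withColBucket(
    colName = "age",
    outputColName = "age_bucket",
    buckets = Array[(Any, Any)]((0, 17), (18, 64), (65, 120))
  )
)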

Constructor has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

    transform: (DataFrame => DataFrame),
    requiredColumns: Seq[String] = Seq.empty[String],
    addedColumns: Seq[String] = Seq.empty[String],
    removedColumns: Seq[String] = Seq.empty[String],
    skipWhenPossible: Boolean = true
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/CustomTransform.scala - About 35 mins to fix
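
Here the five arguments are less of a smell than usual: CustomTransform is itself a parameter object, and the defaults mean call sites pass only what they need. A construction sketch using the fields shown above:

import org.apache.spark.sql.DataFrame
import com.github.mrpowers.spark.daria.sql.CustomTransform

val dropName = CustomTransform(
  transform = (df: DataFrame) => df.drop("name"),
  requiredColumns = Seq("name"),
  removedColumns = Seq("name")
)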

Method bucketFinder has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

      col: Column,
      buckets: Array[(Any, Any)],
      inclusiveBoundries: Boolean = false,
      lowestBoundLte: Boolean = false,
      highestBoundGte: Boolean = false
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/functions.scala - About 35 mins to fix
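
The three boolean flags are the natural thing to fold together. A hypothetical sketch (BucketOptions is not part of spark-daria) that brings the arity back under the limit:

import org.apache.spark.sql.Column

// Hypothetical options value; the flag names (and the Boundries spelling)
// match the existing parameters.
case class BucketOptions(
  inclusiveBoundries: Boolean = false,
  lowestBoundLte: Boolean = false,
  highestBoundGte: Boolean = false
)

def bucketFinder(col: Column, buckets: Array[(Any, Any)], opts: BucketOptions = BucketOptions()): Column = {
  // existing body unchanged, reading the three flags from opts
  col // placeholder return so the sketch compiles
}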

Function dariaCopyMerge has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.
Open

  def dariaCopyMerge(
      srcPath: String,
      dstPath: String,
      sc: SparkContext,
      deleteSource: Boolean = true
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/hadoop/FsHelpers.scala - About 35 mins to fix
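
A call sketch, assuming the function lives on an FsHelpers object as the file path suggests; the paths are illustrative:

import org.apache.spark.sql.SparkSession
import com.github.mrpowers.spark.daria.hadoop.FsHelpers

val spark = SparkSession.builder().master("local").getOrCreate()

// Merges the part files under srcPath into a single dstPath file,
// deleting srcPath afterwards because deleteSource defaults to true.
FsHelpers.dariaCopyMerge(
  srcPath = "/tmp/out/parts",
  dstPath = "/tmp/out/merged.csv",
  sc = spark.sparkContext,
  deleteSource = true
)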

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
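
These rules favor flat control flow. An illustrative sketch (not code from this repository), annotated with the increments the rules above would assign:

def sumPositivesNested(xs: Seq[Int]): Int = {
  var total = 0
  if (xs.nonEmpty) {          // +1: break in the linear flow
    for (x <- xs) {           // +2: another break, nested once
      if (x > 0) total += x   // +3: another break, nested twice
    }
  }
  total                       // cognitive complexity: 6
}

// Same behavior using shorthand the language provides: no flow breaks,
// so no added complexity.
def sumPositivesFlat(xs: Seq[Int]): Int =
  xs.filter(_ > 0).sum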

Function isLuhn has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

  private[sql] def isLuhn(str: String): Option[Boolean] = {
    val s = Option(str).getOrElse(return None)
    if (s.isEmpty()) {
      return Some(false)
    }
Severity: Minor
Found in src/main/scala/com/github/mrpowers/spark/daria/sql/functions.scala - About 25 mins to fix
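
The excerpt stops before the checksum. For reference, a self-contained sketch of the standard Luhn check; the library's actual body may differ (the non-digit guard here is an assumption):

def isLuhn(str: String): Option[Boolean] = {
  val s = Option(str).getOrElse(return None)
  if (s.isEmpty || !s.forall(_.isDigit)) {
    return Some(false)
  }
  // Walk the digits right to left, doubling every second one; subtracting 9
  // from a doubled digit over 9 is the same as summing its two digits.
  val checksum = s.reverse.zipWithIndex.map {
    case (ch, i) =>
      val digit = ch.asDigit
      if (i % 2 == 1) {
        val doubled = digit * 2
        if (doubled > 9) doubled - 9 else doubled
      } else digit
  }.sum
  Some(checksum % 10 == 0)
}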

Function customEquals has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.
Open

  def customEquals(s1: StructField, s2: StructField, ignoreNullable: Boolean = false): Boolean = {
    if (ignoreNullable) {
      s1.name == s2.name &&
      s1.dataType == s2.dataType
    } else {
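
The excerpt cuts off inside the else branch, which presumably also compares nullability. A completed sketch under that assumption:

import org.apache.spark.sql.types.StructField

def customEquals(s1: StructField, s2: StructField, ignoreNullable: Boolean = false): Boolean = {
  if (ignoreNullable) {
    s1.name == s2.name &&
    s1.dataType == s2.dataType
  } else {
    s1.name == s2.name &&
    s1.dataType == s2.dataType &&
    s1.nullable == s2.nullable // assumption: the real branch checks nullable too
  }
}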
