cuebook/CueObserve


Showing 53 of 54 total issues

Similar blocks of code found in 2 locations. Consider refactoring.

        migrations.AddField(
            model_name='connectionparam',
            name='connectionType',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='connectionTypeParam', to='anomaly.connectiontype'),
Severity: Minor
Found in api/anomaly/migrations/0001_initial.py and 1 other location - About 55 mins to fix
api/anomaly/migrations/0001_initial.py on lines 67..70

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 37.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.


Similar blocks of code found in 2 locations. Consider refactoring.

    def get_dimensions(self, obj):
        dimensions = json.loads(obj.dimensions) if obj.metrics else []
        return dimensions if dimensions else []
Severity: Minor
Found in api/anomaly/serializers.py and 1 other location - About 55 mins to fix
api/anomaly/serializers.py on lines 130..132


Function createRCAAnomaly has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

    def createRCAAnomaly(
        anomalyId: int, dimension: str, dimensionValue: str, contriPercent: float, df
    ):
        """
        Create RCA Anomaly for given anomalyId, dimension, dimensionValue
Severity: Minor
Found in api/anomaly/services/rootCauseAnalyses.py - About 55 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"
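To make the nesting rule concrete, here is a hypothetical before/after pair (not code from this repo): both functions find the first positive value, but the nested version pays a penalty at every level of nesting, while guard clauses and an early return keep the flat version linear.

```python
def first_positive_nested(values):
    result = None
    if values:                      # break in linear flow
        for v in values:            # break in flow, plus a nesting penalty
            if v > 0:               # nested deeper still
                if result is None:  # each level costs more to read
                    result = v
    return result

def first_positive_flat(values):
    for v in values:        # one loop
        if v > 0:           # one condition, shallow nesting
            return v        # early return keeps the flow linear
    return None
```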


Similar blocks of code found in 2 locations. Consider refactoring.

        minRow = anomalies[anomalies["y"] == anomalies.y.min()].iloc[-1]
Severity: Minor
Found in api/ops/tasks/detection/core/detectionTypes/lifetime.py and 1 other location - About 55 mins to fix
api/ops/tasks/detection/core/detectionTypes/lifetime.py on lines 15..15


Function prepareAnomalyDataframes has 7 arguments (exceeds 4 allowed). Consider refactoring.

def prepareAnomalyDataframes(
Severity: Major
Found in api/access/utils.py - About 50 mins to fix
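A common fix for a long parameter list is to group related arguments into one value object. The field names below are illustrative, not the actual parameters of `prepareAnomalyDataframes`.

```python
from dataclasses import dataclass

# Illustrative only: bundling related parameters into a dataclass shrinks
# a 7-argument signature to two arguments, and the call site becomes
# self-documenting via keyword construction.
@dataclass
class DetectionSpec:
    metric: str
    dimension: str
    granularity: str = "day"
    top_n: int = 10

def prepare_dataframes(df_rows, spec: DetectionSpec):
    # Real logic would slice df_rows by spec.dimension, spec.top_n, etc.
    return {"metric": spec.metric, "granularity": spec.granularity}
```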

Function addAnomalyDefinition has 7 arguments (exceeds 4 allowed). Consider refactoring.

    def addAnomalyDefinition(
Severity: Major
Found in api/anomaly/services/anomalyDefinitions.py - About 50 mins to fix

Function runQueryOnConnection has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def runQueryOnConnection(connectionType, connectionParams, query, limit=True):
        dataframe = None
        if connectionType == "BigQuery":
            params = connectionParams
            dataframe = BigQuery.fetchDataframe(params, query, limit=limit)
Severity: Minor
Found in api/access/data.py - About 45 mins to fix
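One way to flatten a connection-type if/elif ladder like this is a dispatch table: each branch becomes an entry in a dict, so adding a connector no longer deepens the function. The fetcher functions below are stand-ins, not the project's real accessor classes.

```python
# Stand-in fetchers; in real code these would delegate to the
# BigQuery/Snowflake/etc. connector classes.
def _fetch_bigquery(params, query, limit):
    return f"bigquery:{query}"

def _fetch_snowflake(params, query, limit):
    return f"snowflake:{query}"

FETCHERS = {
    "BigQuery": _fetch_bigquery,
    "Snowflake": _fetch_snowflake,
}

def run_query_on_connection(connection_type, connection_params, query, limit=True):
    fetcher = FETCHERS.get(connection_type)
    if fetcher is None:
        raise ValueError(f"Unsupported connection type: {connection_type}")
    return fetcher(connection_params, query, limit)
```

The lookup replaces N branches with one, and the supported types are visible in one place.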


Function post has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def post(self, request):
        """For new login"""
        res = {"message": "Some error occured", "success": False}
        if request.method == "POST":
            body = json.loads(request.body)
Severity: Minor
Found in api/users/views.py - About 45 mins to fix


Function anomalyService has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

def anomalyService(dimValObj, dfDict, anomalyDefProps, detectionRuleType, detectionParams):
    """
    Method to conduct the anomaly detection process
    """
    df = pd.DataFrame(dfDict)
Severity: Minor
Found in api/ops/tasks/detection/core/anomalyDetection.py - About 45 mins to fix


Function slackAlertHelper has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

    def slackAlertHelper(title, message, name, details="", anomalyId: int = None):
        """
        Helper method for slackAlert
        """
        token = ""
Severity: Minor
Found in api/anomaly/services/alerts.py - About 45 mins to fix


Identical blocks of code found in 3 locations. Consider refactoring.

    todayISO = today.replace(hour=0, minute=0, second=0, microsecond=0, tzinfo=None).isoformat()[:19]
Severity: Major
Found in api/ops/tasks/detection/core/detectionTypes/lifetime.py and 2 other locations - About 45 mins to fix
api/ops/tasks/detection/core/detectionTypes/percentageChange.py on lines 34..34
api/ops/tasks/detection/core/detectionTypes/prophet.py on lines 46..46
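These three identical `todayISO` lines are a textbook extraction candidate: the repeated expression can live in one shared helper that each detection module imports. A minimal sketch, assuming the modules can share a common utilities module (the helper name is illustrative):

```python
from datetime import datetime

def midnight_iso(today: datetime) -> str:
    """Midnight of the given day as a naive ISO-8601 string, seconds precision."""
    return today.replace(
        hour=0, minute=0, second=0, microsecond=0, tzinfo=None
    ).isoformat()[:19]
```

Each of the three detection types would then call `midnight_iso(today)`, so a change to the timestamp format happens once rather than three times.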


Function event_logs has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

def event_logs(anomalyDef_id, status, publishedCount, totalCount):
    """Event logs on anomaly definition Run"""
    userId = "UnIdentified"
    try:
        # userObject = InstallationTable.objects.all()[0]
Severity: Minor
Found in api/anomaly/services/telemetry.py - About 45 mins to fix


Identical blocks of code found in 3 locations. Consider refactoring.

    todayISO = today.replace(hour=0, minute=0, second=0, microsecond=0, tzinfo=None).isoformat()[:19]
Severity: Major
Found in api/ops/tasks/detection/core/detectionTypes/percentageChange.py and 2 other locations - About 45 mins to fix
api/ops/tasks/detection/core/detectionTypes/lifetime.py on lines 39..39
api/ops/tasks/detection/core/detectionTypes/prophet.py on lines 46..46


Identical blocks of code found in 3 locations. Consider refactoring.

    todayISO = today.replace(
Severity: Major
Found in api/ops/tasks/detection/core/detectionTypes/prophet.py and 2 other locations - About 45 mins to fix
api/ops/tasks/detection/core/detectionTypes/lifetime.py on lines 39..39
api/ops/tasks/detection/core/detectionTypes/percentageChange.py on lines 34..34


Function contributionOnDimensionalValues has 5 arguments (exceeds 4 allowed). Consider refactoring.

def contributionOnDimensionalValues(
Severity: Minor
Found in api/access/utils.py - About 35 mins to fix

Function cueObserveAnomalyAlert has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def cueObserveAnomalyAlert(token, channelId, fileImg, title="", message="", details=""):
Severity: Minor
Found in alerts-api/src/alerts.py - About 35 mins to fix

Function detect has 5 arguments (exceeds 4 allowed). Consider refactoring.

def detect(df, granularity, detectionRuleType, detectionParams, limit=None):
Severity: Minor
Found in api/ops/tasks/detection/core/anomalyDetection.py - About 35 mins to fix

Function valueThresholdDetect has 5 arguments (exceeds 4 allowed). Consider refactoring.

def valueThresholdDetect(df, granularity, operator, value1, value2):
Severity: Minor
Found in api/ops/tasks/detection/core/detectionTypes/valueThreshold.py - About 35 mins to fix

Function _anomalyDetectionForValue has 5 arguments (exceeds 4 allowed). Consider refactoring.

def _anomalyDetectionForValue(
Severity: Minor
Found in api/ops/tasks/rootCauseAnalysis.py - About 35 mins to fix

Function topNDimensionalValues has 5 arguments (exceeds 4 allowed). Consider refactoring.

def topNDimensionalValues(
Severity: Minor
Found in api/access/utils.py - About 35 mins to fix