ContentMine/thresher: lib/scraper.js

Summary

Maintainability: F (about 3 days of estimated remediation)
Test Coverage

Function validate has a Cognitive Complexity of 43 (exceeds 5 allowed). Consider refactoring.

Scraper.prototype.validate = function(def){
  var problems = [];
  // url key must exist
  if (!def.url) {
    problems.push('must have "url" key');
Severity: Minor
Found in lib/scraper.js - About 6 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

Further reading
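The validate preview above shows only the first check, but a validator that reaches a Cognitive Complexity of 43 is usually a long chain of nested conditionals. One common way to flatten it is to make the checks data-driven, so each rule is an independent flat predicate. The sketch below is hypothetical: only the "url" rule comes from the snippet; the second rule and the helper names are invented for illustration.

```javascript
// Hypothetical sketch: a data-driven validator. Each rule is a flat
// predicate, so adding a check no longer deepens the control flow.
var rules = [
  {
    // mirrors the `url` check visible in the snippet above
    test: function (def) { return !def.url; },
    problem: 'must have "url" key'
  },
  {
    // invented example of a further rule
    test: function (def) { return def.elements !== undefined && typeof def.elements !== 'object'; },
    problem: '"elements" must be an object'
  }
];

function validate(def) {
  var problems = [];
  rules.forEach(function (rule) {
    if (rule.test(def)) {
      problems.push(rule.problem);
    }
  });
  return problems;
}
```

Each rule contributes a constant, non-nested amount of complexity, so the function's score stays flat no matter how many checks are added.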

Function scrapeElement has a Cognitive Complexity of 23 (exceeds 5 allowed). Consider refactoring.

Scraper.prototype.scrapeElement = function(doc, element, scrapeUrl, key, follow_url) {
  var scraper = this;
  follow_url = typeof follow_url !== 'undefined' ? follow_url : false;
  // extract element
  key = key || element.name;
Severity: Minor
Found in lib/scraper.js - About 3 hrs to fix


File scraper.js has 283 lines of code (exceeds 250 allowed). Consider refactoring.

// # Scraper
//
// > Scraper class in the Node.js Thresher package.
// >
// > author: [Richard Smith-Unna](http://blahah/net)
Severity: Minor
Found in lib/scraper.js - About 2 hrs to fix

Function scrapeElement has 46 lines of code (exceeds 25 allowed). Consider refactoring.

Scraper.prototype.scrapeElement = function(doc, element, scrapeUrl, key, follow_url) {
  var scraper = this;
  follow_url = typeof follow_url !== 'undefined' ? follow_url : false;
  // extract element
  key = key || element.name;
Severity: Minor
Found in lib/scraper.js - About 1 hr to fix

Function validate has 43 lines of code (exceeds 25 allowed). Consider refactoring.

Scraper.prototype.validate = function(def){
  var problems = [];
  // url key must exist
  if (!def.url) {
    problems.push('must have "url" key');
Severity: Minor
Found in lib/scraper.js - About 1 hr to fix

Function scrapeUrl has 31 lines of code (exceeds 25 allowed). Consider refactoring.

Scraper.prototype.scrapeUrl = function(theUrl, node) {
  var scraper = this;
  scraper.startTicker();
  scraper.results = scraper.results || {};
  node = node || scraper.tree.root;
Severity: Minor
Found in lib/scraper.js - About 1 hr to fix

Function Scraper has 28 lines of code (exceeds 25 allowed). Consider refactoring.

var Scraper = function(definition, headless) {
  var scraper = this;

  EventEmitter2.call(this, {
    wildcard: true,
Severity: Minor
Found in lib/scraper.js - About 1 hr to fix

Function runRegex has a Cognitive Complexity of 9 (exceeds 5 allowed). Consider refactoring.

Scraper.prototype.runRegex = function(string, regex) {
  var re;
  if (regex instanceof Object) {
    if (regex.flags) {
      var flags = regex.flags.join('');
Severity: Minor
Found in lib/scraper.js - About 55 mins to fix

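The runRegex preview suggests the regex may arrive either as a plain string or as an object carrying `source` and an optional `flags` array; the nesting comes from handling those cases inline. A hypothetical flattening (the object shape is inferred from the preview, not confirmed by the full source) uses guard clauses to build the RegExp in one pass:

```javascript
// Hypothetical sketch: build the RegExp with guard clauses instead of
// nested conditionals. Assumes regex is a string or {source, flags: [...]}.
function buildRegex(regex) {
  if (!(regex instanceof Object)) {
    // plain string pattern
    return new RegExp(regex);
  }
  // object form: join the flags array (if any) into a flag string
  var flags = regex.flags ? regex.flags.join('') : '';
  return new RegExp(regex.source, flags);
}

function runRegex(string, regex) {
  var re = buildRegex(regex);
  return string.match(re);
}
```

Splitting construction from execution also makes the flag-joining logic testable on its own.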

Avoid deeply nested control flow statements.

            if (keywords.indexOf(j) == -1) {
              // this element has child[ren]
              isLeaf = false;
              checkLeaves(e);
            }
Severity: Major
Found in lib/scraper.js - About 45 mins to fix
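The flagged check sits inside several surrounding loops and conditionals that the preview does not show. Inverting the condition and using `continue` is one standard way to flatten that nesting; the sketch below is illustrative (the function name and return shape are invented, and `keywords` is assumed to be the list of reserved definition keys):

```javascript
// Hypothetical sketch: early `continue` keeps the loop body flat.
var keywords = ['selector', 'attribute', 'download', 'regex', 'follow', 'name'];

function childElementKeys(e) {
  var children = [];
  for (var j in e) {
    if (keywords.indexOf(j) !== -1) {
      continue; // base keyword, not a child element
    }
    children.push(j); // this element has child(ren)
  }
  return children;
}
```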

Avoid deeply nested control flow statements.

            if (!e.selector) {
              problems.push('element ' + k + ' has no selector');
            }
Severity: Major
Found in lib/scraper.js - About 45 mins to fix

Function getChildElements has a Cognitive Complexity of 8 (exceeds 5 allowed). Consider refactoring.

function getChildElements(obj) {
  var elementsArray = [];
  // process followables first, they
  // will be excluded from results later
  if (obj.followables) {
Severity: Minor
Found in lib/scraper.js - About 45 mins to fix


Function scrapeElement has 5 arguments (exceeds 4 allowed). Consider refactoring.

Scraper.prototype.scrapeElement = function(doc, element, scrapeUrl, key, follow_url) {
Severity: Minor
Found in lib/scraper.js - About 35 mins to fix
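The usual fix for a too-long parameter list is to fold the trailing optional parameters into a single options object. The sketch below is hypothetical: the parameter names follow the signature above, but the body is illustrative only and does not reproduce the real extraction logic.

```javascript
// Hypothetical sketch: collapse the optional `key` and `follow_url`
// parameters into one options object, keeping the signature at four
// arguments and making call sites self-documenting.
function scrapeElement(doc, element, scrapeUrl, opts) {
  opts = opts || {};
  var key = opts.key || element.name;
  var followUrl = opts.followUrl !== undefined ? opts.followUrl : false;
  // ... extraction would proceed here ...
  return { key: key, followUrl: followUrl };
}
```

A call site then reads `scrapeElement(doc, el, url, { followUrl: true })` instead of positional `undefined` placeholders.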

Function Scraper has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

var Scraper = function(definition, headless) {
  var scraper = this;

  EventEmitter2.call(this, {
    wildcard: true,
Severity: Minor
Found in lib/scraper.js - About 35 mins to fix


Function fillChildResults has a Cognitive Complexity of 6 (exceeds 5 allowed). Consider refactoring.

function fillChildResults(scraper, obj, newRes) {
  var baseKeys = ['selector', 'attribute', 'download', 'regex', 'follow', 'name'];
  for (var key in obj) {
    if (baseKeys.indexOf(key) >= 0) {
      // ignore base keys
Severity: Minor
Found in lib/scraper.js - About 25 mins to fix

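In fillChildResults the preview shows a loop whose body starts by skipping the base keys. Filtering those keys out up front removes the nested "ignore" branch entirely; the helper below is a hypothetical sketch (its name and return value are invented, only `baseKeys` comes from the snippet):

```javascript
// Hypothetical sketch: filter base keys before the loop, so the loop
// body no longer needs an inline "ignore" branch.
var baseKeys = ['selector', 'attribute', 'download', 'regex', 'follow', 'name'];

function childResultKeys(obj) {
  return Object.keys(obj).filter(function (key) {
    return baseKeys.indexOf(key) < 0; // keep only non-base keys
  });
}
```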

Similar blocks of code found in 2 locations. Consider refactoring.

  if (obj.elements) {
    for (var key in obj.elements) {
      var element = obj.elements[key];
      element.name = key;
      elementsArray.push(element);
Severity: Major
Found in lib/scraper.js and 1 other location - About 1 hr to fix
lib/scraper.js on lines 188..195

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code tends both to keep replicating and to diverge, leaving bugs as the two similar implementations drift apart in subtle ways.

Tuning

This issue has a mass of 73.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is reporting duplication too readily, try raising the threshold. If you suspect it isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
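Both flagged blocks walk a sub-object (`elements` or `followables`), stamp each entry with its key as `name`, and append it to `elementsArray`. Extracting that loop into a shared helper removes the duplication; the sketch below is a hypothetical reconstruction built from the two previews (the helper name is invented):

```javascript
// Hypothetical sketch: one helper replaces the two near-identical loops.
function collectNamed(group, elementsArray) {
  for (var key in group) {
    var element = group[key];
    element.name = key;
    elementsArray.push(element);
  }
  return elementsArray;
}

function getChildElements(obj) {
  var elementsArray = [];
  // process followables first; they are excluded from results later
  if (obj.followables) {
    collectNamed(obj.followables, elementsArray);
  }
  if (obj.elements) {
    collectNamed(obj.elements, elementsArray);
  }
  return elementsArray;
}
```

With the loop in one place, a future change (say, skipping entries without a selector) only needs to be made once.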

Similar blocks of code found in 2 locations. Consider refactoring.

  if (obj.followables) {
    for (var key in obj.followables) {
      var element = obj.followables[key];
      element.name = key;
      elementsArray.push(element);
Severity: Major
Found in lib/scraper.js and 1 other location - About 1 hr to fix
lib/scraper.js on lines 196..203

This issue has a mass of 73.
