Posts Tagged ‘javascript’

Re-Blog: Testing JavaScript on Various Platforms with Karma and Sauce Labs

November 6th, 2014 by Bill McGee

Thanks to Ben Ripkins for this great blog post on codecentric!

See an excerpt below.

The web can be a brutal environment, with many combinations of browsers and operating systems (platforms). It is quite likely that your continuous integration setup covers only a small portion of your users’ platforms. The unfortunate truth is that testing on these platforms is necessary to accommodate compliance differences and partial support of standards and technologies like HTML, JavaScript, CSS and network protocols.

Sauce Labs is a service that can be used to test web applications by simulating user behaviour or executing JavaScript unit tests. It eases testing by supporting 442 different platform combinations and by recording test execution. This means that you can actually see what a user might have seen, and therefore trace errors easily.

For the purpose of this blog post we will cover the following setup:

  • Sources and tests written as CommonJS modules, bundled for the web using Browserify.
  • Mocha as our test framework.
  • Local test execution in the headless PhantomJS browser.
  • Testing against the following platforms in our CI environment:
    • Chrome 35 on Windows 7,
    • Firefox 30 on an arbitrary operating system,
    • iPhone 7.1 on OS X 10.9 and
    • Internet Explorer on Windows 8.1.
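In Karma, a platform matrix like the one above is typically declared as `customLaunchers` in `karma.conf.js`. A sketch of what that might look like (the launcher names are our own, and the exact capability strings should be checked against the current Sauce Labs platform list):

```javascript
// karma.conf.js (excerpt) -- illustrative only; launcher names and
// version strings are assumptions, not taken from the original post.
module.exports = function(config) {
  config.set({
    frameworks: ['mocha', 'browserify'],
    customLaunchers: {
      sl_chrome:  { base: 'SauceLabs', browserName: 'chrome', version: '35', platform: 'Windows 7' },
      sl_firefox: { base: 'SauceLabs', browserName: 'firefox', version: '30' },
      sl_ios:     { base: 'SauceLabs', browserName: 'iphone', version: '7.1', platform: 'OS X 10.9' },
      sl_ie:      { base: 'SauceLabs', browserName: 'internet explorer', platform: 'Windows 8.1' }
    },
    browsers: ['sl_chrome', 'sl_firefox', 'sl_ios', 'sl_ie'],
    sauceLabs: { testName: 'Cross-platform unit tests' }
  });
};
```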


Announcing Sauce Integration With Siesta

September 16th, 2014 by Bill McGee

Sauce is thrilled to announce that we’ve integrated with Siesta by Bryntum!

Siesta is a JavaScript unit testing tool that can help you test any JavaScript code, test the DOM, and simulate user interactions. The tool can be used together with any type of JavaScript codebase – jQuery, Ext JS, NodeJS, Dojo, YUI etc. Using the API, you can choose from many types of assertions, ranging from simple logical JS object comparisons to verifying that an HTML element is visible in the DOM. It comes in two versions: Lite and Standard. With Lite you can easily test your JavaScript in the browser, and with Standard you can also automate your tests (highly recommended).

Sauce is a cloud-based tool that enables you to securely test your web and mobile apps across 385+ browser/OS/platform/device combinations.

How does Siesta work with Sauce?


Adding Custom Methods to Data Models with Angular $resource

July 24th, 2014 by Bill McGee

Sauce Labs software developer Alan Christopher Thomas and his team have been hard at work updating our stack. He shared with us some insight into their revised dev process, so we thought we’d show off what he’s done. Read his follow-up post below.

Thanks for your great feedback on this post. Previously, we examined three different approaches to modeling data in AngularJS. We’ve since incorporated some of your feedback, so we wanted to share that information here. You can also see the updates we made in our original post.

One of our commenters mentioned a cleaner approach to adding custom methods to $resource models when our API response allows it, using angular.extend().

In this implementation, we’re imagining an API response that looks like this:

[
  {
    "breakpointed": null,
    "browser": "android",
    "browser_short_version": "4.3",
    ...
  },
  {
    ...
  }
  ...
]

Each of the response objects in the list is a “Job” that contains a whole lot of metadata about an individual job that’s been run in the Sauce cloud.

We want to be able to iterate over the jobs to build a list for our users, showing the outcome of each: “Pass,” “Fail,” etc.

Our template looks something like this:

{{ job.getResult() }} {{ job.name }}

Note the job.getResult() call. In order to get this convenience, however, we need to be able to attach a getResult() method to each Job returned in the response.

So, here’s what the model looks like, using Angular $resource:

angular.module('job.models', [])
    .factory('Job', ['$resource', function($resource) {
        var Job = $resource('/api/jobs/:jobId', {
            full: 'true',
            jobId: '@id'
        });

        angular.extend(Job.prototype, {
            getResult: function() {
                if (this.status == 'complete') {
                    if (this.passed === null) return "Finished";
                    else if (this.passed === true) return "Pass";
                    else if (this.passed === false) return "Fail";
                }
                else return "Running";
            }
        });

        return Job;
    }]);

Note that since each resulting object returned by $resource is a Job object itself, we can simply extend Job.prototype to include the behavior we want for every individual job instance.
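The mechanism at work here can be seen in plain JavaScript, independent of Angular: extending a constructor’s prototype makes a method available on every existing and future instance. A minimal sketch, using Object.assign in place of angular.extend:

```javascript
// Stand-in for the $resource-generated constructor.
function Job(data) {
  Object.assign(this, data);
}

// Like angular.extend(Job.prototype, {...}): every Job instance,
// past and future, picks up getResult() through the prototype chain.
Object.assign(Job.prototype, {
  getResult: function() {
    if (this.status === 'complete') {
      if (this.passed === null) return 'Finished';
      if (this.passed === true) return 'Pass';
      if (this.passed === false) return 'Fail';
    }
    return 'Running';
  }
});

var done = new Job({ status: 'complete', passed: true });
var running = new Job({ status: 'in progress', passed: null });
console.log(done.getResult());    // "Pass"
console.log(running.getResult()); // "Running"
```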

Then, our controller looks like this (revised from the original post to make use of the not-so-obvious promise):

angular.module('job.controllers', [])
    .controller('jobsController', ['$scope', 'Job', function($scope, Job) {
        $scope.loadJobs = function() {
            $scope.isLoading = true;
            Job.query().$promise.then(function(jobs) {
                $scope.jobs = jobs;
                $scope.isLoading = false;
            });
        };

        $scope.loadJobs();
    }]);

The simplicity of this example makes $resource a much more attractive option for our team’s data-modeling needs; for simple applications, custom behavior isn’t unwieldy to implement.

– Alan Christopher Thomas, Software Developer, Sauce Labs

AngularJS Data Models: $http VS $resource VS Restangular

July 15th, 2014 by Bill McGee

Sauce Labs software developer Alan Christopher Thomas and his team have been hard at work updating our stack. He shared with us some insight into their dev process, so we thought we’d show off what he’s done. Read his post below.

Over the past few months, the Sauce Labs web team has fixed its crosshairs on several bits of our stack that needed to be refreshed. One of those bits is the list of jobs all customers see when they first log into their account. It looks like this:

[screenshot: the jobs list in our current app]

Our current app is built in Backbone.js. We vetted lots of options for frontend MVC frameworks and data binding that could replace and simplify the existing Backbone code: Ember, Angular, React, Rivets, Stapes, etc.

After lots of research, some prototyping, and a healthy dose of personal preference, our team decided we were most comfortable with Angular.

We had one more thing we wanted to verify, though, before settling.

How complicated will it be to model our data?

This was the first question on most of our minds, and it was the one question about Angular on which Google fell into a void of silence. Backbone has models and collections. Ember.js has Ember Data and ember-model. Stapes has extensible observable objects that can function as collections. But what about Angular? Most examples we found were extremely thin on the data layer, just returning simple JavaScript objects and assigning them directly to a $scope model.

So, we built a small proof of concept using three different AngularJS data modeling techniques. This is a dumbed down version of our Jobs page, which only displays a list of jobs and their results. Our only basic requirement was that we kept business logic out of our controllers so they wouldn’t become bloated.

We gave ourselves some flexibility with the API responses and allowed them to be wrapped with an object or not wrapped to emphasize the strengths of each approach. However, all calls require limit and full parameters to be passed in the GET query string.
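To make that requirement concrete, every approach below ultimately issues a GET with those two parameters in the query string. A hypothetical helper (not part of any of the libraries) shows the URL shape we expect:

```javascript
// Hypothetical helper showing the query string every approach must produce;
// $http builds it by hand, while $resource and Restangular build it from a
// params object.
function buildJobsUrl(params) {
  var pairs = Object.keys(params).map(function(key) {
    return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
  });
  return '/api/jobs?' + pairs.join('&');
}

console.log(buildJobsUrl({ limit: 20, full: true }));
// "/api/jobs?limit=20&full=true"
```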

Here’s what we wanted the resulting DOM template to look like:

{{ job.getResult() }} {{ job.name }}

Note that each resulting job should be able to have a getResult() method that displays a human-readable outcome in a badge. The rendered page looks like this:

[screenshot: the rendered jobs list]

The Code: $http vs $resource vs Restangular

So, here’s the resulting code for all three approaches, each implementing a getResult() method on every job.

$http

In this approach, we created a service that made the API calls and wrapped each result as a Job() object with a getResult() method defined on the prototype.

API Response Format:

{
  "meta": {}, 
  "objects": [
    {
      "breakpointed": null, 
      "browser": "android", 
      "browser_short_version": "4.3",
      ...
    },
    {
      ...
    },
    ...
  ]
}

models.js:

angular.module('job.models', [])
    .service('JobManager', ['$q', '$http', 'Job', function($q, $http, Job) {
        return {
            getAll: function(limit) {
                var deferred = $q.defer();

                $http.get('/api/jobs?limit=' + limit + '&full=true').success(function(data) {
                    var jobs = [];
                    for (var i = 0; i < data.objects.length; i++) {
                        jobs.push(new Job(data.objects[i]));
                    }
                    deferred.resolve(jobs);
                });

                return deferred.promise;
            }
        };
    }])
    .factory('Job', function() {
        function Job(data) {
            for (var attr in data) {
                if (data.hasOwnProperty(attr))
                    this[attr] = data[attr];
            }
        }

        Job.prototype.getResult = function() {
            if (this.status == 'complete') {
                if (this.passed === null) return "Finished";
                else if (this.passed === true) return "Pass";
                else if (this.passed === false) return "Fail";
            }
            else return "Running";
        };

        return Job;
    });

controllers.js:

angular.module('job.controllers', [])
    .controller('jobsController', ['$scope', 'JobManager', function($scope, JobManager) {
        var limit = 20;
        $scope.loadJobs = function() {
            JobManager.getAll(limit).then(function(jobs) {
                $scope.jobs = jobs;
                limit += 10;
            });
        };

        $scope.loadJobs();
    }]);

This approach made for a pretty simple controller, but since we needed a custom method on the model, our services and factories quickly became verbose. Also, if we were to abstract away this behavior to apply to other data types (sub-accounts, tunnels, etc.), we might end up writing a whole lot of boilerplate.
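To give a flavor of what that abstraction might look like, here is a hypothetical sketch in plain JavaScript, without the Angular plumbing: one generic function wraps a raw API list in model instances, so each type (jobs, sub-accounts, tunnels, etc.) only supplies its constructor instead of repeating the wrapping loop.

```javascript
// Hypothetical sketch: generic wrapping shared by all model types.
function wrapObjects(Model, objects) {
  return objects.map(function(data) {
    return new Model(data);
  });
}

// One concrete model type; others would reuse wrapObjects() unchanged.
function Job(data) { Object.assign(this, data); }
Job.prototype.getResult = function() {
  return this.status === 'complete' ? 'Finished' : 'Running';
};

var jobs = wrapObjects(Job, [{ status: 'complete' }, { status: 'in progress' }]);
console.log(jobs[0].getResult()); // "Finished"
console.log(jobs[1].getResult()); // "Running"
```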

$resource

UPDATE: Per Micke’s suggestion in the comments section below, we’ve posted a follow-up with a cleaner implementation of the $resource version of the Job model. It parses an API response similar to the one shown in the Restangular scenario and allows for much cleaner method declaration using angular.extend().

Angular provides its own $resource factory, which has to be included in your project as a separate dependency. It removes some of the pain we felt writing the JobManager service boilerplate: it lets us apply our custom method directly to the $resource prototype and transform responses so that each item is wrapped in a Job instance.

API Response Format:

{
  "items": [
    {
      "breakpointed": null, 
      "browser": "android", 
      "browser_short_version": "4.3", 
      ...
    }, 
    {
      ...
    }
    ...
  ]
}

models.js:

angular.module('job.models', [])
    .factory('Job', ['$resource', function($resource) {
        var Job = $resource('/api/jobs/:jobId', { full: 'true', jobId: '@id' }, {
            query: {
                method: 'GET',
                isArray: false,
                transformResponse: function(data, header) {
                    var wrapped = angular.fromJson(data);
                    angular.forEach(wrapped.items, function(item, idx) {
                        wrapped.items[idx] = new Job(item);
                    });
                    return wrapped;
                }
            }
        });

        Job.prototype.getResult = function() {
            if (this.status == 'complete') {
                if (this.passed === null) return "Finished";
                else if (this.passed === true) return "Pass";
                else if (this.passed === false) return "Fail";
            }
            else return "Running";
        };

        return Job;
    }]);

controllers.js:

angular.module('job.controllers', [])
    .controller('jobsController', ['$scope', 'Job', function($scope, Job) {
        var limit = 20;
        $scope.loadJobs = function() {
            var jobs = Job.query({ limit: limit }, function(jobs) {
                $scope.jobs = jobs.items;
                limit += 10;
            });
        };

        $scope.loadJobs();
    }]);

This approach also makes for a pretty elegant controller, except we really didn’t like that the query() method didn’t return a promise directly, but gave us an object with the promise in a $promise attribute (thanks Louis!). It felt a little ugly. Also, the process of transforming result objects and wrapping them felt like a strange dance to achieve some simple behavior (UPDATE: see this post). We’d probably end up writing more boilerplate to abstract that part away.

Restangular

Last, but not least, we gave Restangular a shot. Restangular is a third-party library that attempts to abstract away pain points of dealing with API responses, reduce boilerplate, and do it in the most Angular-y way possible.

API Response Format:

[
  {
    "breakpointed": null, 
    "browser": "android", 
    "browser_short_version": "4.3", 
    ...
  }, 
  {
    ...
  }
  ...
]

models.js:

angular.module('job.models', [])
  .service('Job', ['Restangular', function(Restangular) {
    var Job = Restangular.service('jobs');

    Restangular.extendModel('jobs', function(model) {
      model.getResult = function() {
        if (this.status == 'complete') {
          if (this.passed === null) return "Finished";
          else if (this.passed === true) return "Pass";
          else if (this.passed === false) return "Fail";
        }
        else return "Running";
      };

      return model;
    });

    return Job;
  }]);

controllers.js:

angular.module('job.controllers', [])
  .controller('jobsController', ['$scope', 'Job', function($scope, Job) {
    var limit = 20;
    $scope.loadJobs = function() {
      Job.getList({ full: true, limit: limit }).then(function(jobs) {
        $scope.jobs = jobs;
        limit += 10;
      });
    };
    $scope.loadJobs();
  }]);

In this one, we got to cheat and use Restangular.service(), which provides all the RESTful goodies for us. It even abstracted away writing out full URLs for our API calls. Restangular.extendModel() gives us an elegant way to attach methods to each of our model results, making getResult() straightforward and readable. Lastly, the call in our controller returns a promise! This let us write the controller logic a bit more cleanly and allows us to be more flexible with the response in the future.

tldr; Concluding Thoughts

Each of the three approaches has its appropriate use cases, but I think in ours we’re leaning toward Restangular.

$http – $http is built into Angular, so there’s no need for the extra overhead of loading in an external dependency. $http is good for quick retrieval of server-side data that doesn’t really need any specific structure or complex behaviors. It’s probably best injected directly into your controllers for simplicity’s sake.

$resource – $resource is good for situations that are slightly more complex than $http. It’s good when you have pretty structured data, but you plan to do most of your crunching, relationships, and other operations on the server side before delivering the API response. $resource doesn’t let you do much once you get the data into your JavaScript app, so you should deliver it to the app in its final state and make more REST calls when you need to manipulate or look at it from a different angle. Any custom behavior on the client side will need a lot of boilerplate.

Restangular – Restangular is a perfect option for complex operations on the client side. It lets you easily attach custom behaviors and interact with your data in much the same way as other model paradigms you’ve used in the past. It’s promise-based, clean, and feature-rich. However, it might be overkill if your needs are basic, and it carries along with it any extra implications that come with bringing in additional third-party dependencies.

Restangular seems to be a decently active project, with the prospect of a 2.0 that’s compatible with Angular 2.0 (currently a private repository). However, a lot of the project’s progress seems to depend on the work of a single developer for the time being.

We’re looking forward to seeing how Restangular progresses and whether or not it seems like a good fit for us at Sauce! If this blog post has piqued your interest and you feel as passionate about web development as we do, feel free to check out our career opportunities here at Sauce or send us a note.

Alan Christopher Thomas, Software Developer, Sauce Labs

Re-Blog: JavaScript Multi Module Project – Continuous Integration

June 11th, 2014 by Bill McGee

Our friend Lubos Krnac describes how to integrate Sauce with Protractor in his quest to implement continuous integration in his JavaScript multi-module project with Grunt.

Below is a quote from his most recent blog post, alongside some code.

Read the rest of his post to get the full how-to here.

An important part of this setup is Protractor integration with Sauce Labs. Sauce Labs provides a Selenium server with a WebDriver API for testing. Protractor uses Sauce Labs by default when you specify their credentials. Credentials are the only special configuration in test/protractor/protractorConf.js (bottom of the snippet); the rest of the configuration was taken from the grunt-protractor-coverage example. I am using this Grunt plug-in for running Protractor tests and measuring code coverage.

// A reference configuration file.
exports.config = {
  // ----- What tests to run -----
  //
  // Spec patterns are relative to the location of this config.
  specs: [
    'test/protractor/*Spec.js'
  ],
  // ----- Capabilities to be passed to the webdriver instance ----
  //
  // For a full list of available capabilities, see the Protractor
  // documentation.
  capabilities: {
    'browserName': 'chrome'
    //  'browserName': 'firefox'
    //  'browserName': 'phantomjs'
  },
  params: {
  },
  // ----- More information for your tests ----
  //
  // A base URL for your application under test. Calls to protractor.get()
  // with relative paths will be prepended with this.
  baseUrl: 'http://localhost:3000/',
  // Options to be passed to Jasmine-node.
  jasmineNodeOpts: {
    showColors: true, // Use colors in the command line report.
    isVerbose: true, // List all tests in the console
    includeStackTrace: true,
    defaultTimeoutInterval: 90000
  },
  
  sauceUser: process.env.SAUCE_USERNAME,
  sauceKey: process.env.SAUCE_ACCESS_KEY
};

You may ask, “How can I use localhost in the configuration when a remote Selenium server is used for testing?” Good question. Sauce Labs provides a very useful feature called Sauce Connect. It is a tunnel that emulates access to your machine from the Selenium server, which is super useful when you need to bypass a company firewall. It will be used later in the main project’s CI configuration.

Have an idea for a blog post, webinar, or more? We want to hear from you! Submit topic ideas (or questions!) here.

Javascript + Selenium: The Rockstar Combination of Testing

July 25th, 2011 by Ashley Wilson

For our July Selenium meetup, held last Thursday, we wanted to give attendees something a little different to chew on. Thanks to our good friends at Yammer, who co-hosted the event with us, we did so not only with delicious catered Mexican food, but also plenty of Javascript & Selenium testing goodness to go around.

Bob Remeika, senior engineer at Yammer, gave a spirited presentation that left no one questioning his stance on testing (his opening slide – “Test your shit” – really said it all). He gave us an inside look at how Yammer tests using a combination of Jellyfish and Sauce OnDemand, and gave some great advice on knowing what and how to test when you’re just starting out.

 

We also had Adam Christian, Sauce Labs’ Javascript Aficionado and the creator of Jellyfish, give two talks. The first, a lightning talk titled “Javascript Via Selenium: The Good, The Bad, The Obvious”, covered some of the lesser known things about Javascript testing via Selenium.

The second showed off how you can use Jellyfish, the open source Javascript runner that he announced a few weeks ago, to run your JS unit tests in any environment.

Thanks to Adam, Bob, and Yammer for making this quite the fun and memorable meetup. As always, the San Francisco Selenium Meetup group is free to join & we meet monthly at different venues around the Bay Area to talk all things testing. See you in August!

JavaScript Unit Testing with Jellyfish and OnDemand

June 20th, 2011 by Adam Christian

Jellyfish logo

Web application testing turns out to be very similar to an onion. You keep peeling back layers and finding more testing that can, and probably should, be done. But you shouldn’t only be concerned with testing the smooth outer layer that your users will see.  There’s still plenty of automated testing to be done on those inner layers, and it doesn’t require much extra work.

The testing I’m referring to is the JavaScript unit tests. For most development teams, this is an incredibly important step and should be done well before using Selenium for testing web UI and user flows.

Most people think of Selenium as the “click” here, “type” there project. And for many purposes, that’s correct. However, Selenium 2 offers some new functionality that lends itself to a slightly bigger set of use cases: the ‘execute’ method. It allows you to execute arbitrary JavaScript in the scope of the AUT (application under test), without having to deal with the frustrating complexities of ‘getEval’ in Selenium 1.

Specifically because of this functionality, I wrote a driver for Jellyfish (jelly.io), a framework that allows you to write node.js scripts that run JavaScript in the browser. In order to do this, I implemented a tiny subset of the WebDriver JSON wire protocol in node.js and called it ‘wd‘. These projects are interesting in their own right, but to get straight to the point: if you have tests in FooUnit, QUnit or any other JavaScript testing framework that run in the browser, we now have a way for you to do all of that in the Sauce OnDemand cloud.
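For a sense of how small that subset can be: opening a browser over the JSON wire protocol amounts to a single POST to /session whose body names the desired capabilities. A simplified sketch of the payload a minimal client like ‘wd’ would send (field names follow the Selenium 2 wire protocol; transport details are omitted):

```javascript
// Sketch of the JSON body a minimal WebDriver client POSTs to the
// Selenium server's /session endpoint to open a browser.
function newSessionPayload(browserName, version, platform) {
  return JSON.stringify({
    desiredCapabilities: {
      browserName: browserName,
      version: version,
      platform: platform
    }
  });
}

console.log(newSessionPayload('iexplorer', '9', 'VISTA'));
```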

[diagram: Jellyfish integration]

For this example, let’s use QUnit. In the hopes of covering all the bases, let’s assume your QUnit tests are hosted on a web site internal to your company. Let’s also assume that the way you currently run them is by opening a browser window and going to ‘file open’, or navigating to a local URL, and sitting there watching the tests run. When they are done, you manually look through the tests and results to decide whether you broke things or not.

With Jellyfish and OnDemand, the way this now works is that you have a node.js script that runs after every check-in (perhaps by Jenkins) using different browser and platform combinations. When the tests are complete, an email will be sent to all your developers each time you have a failure.

[diagram: architecture overview]

So how do we do this?

Start by installing Jellyfish, the directions for which are front and center at jelly.io.

Next we will build a straightforward Jellyfish script that will start a Sauce Labs session, navigate to your QUnit tests URL, and assert the results when they finish.

For my examples, I am going to be using the full jQuery Test Suite that I got from jquery.com.

Let’s talk about the contents of the script one piece at a time:

var jellyfish = require('jellyfish')  
  , assert = require('assert');

 

This imports the jellyfish package and the assert module, which is a nice lightweight way to make assertions in node.js.

var sauce = jellyfish.createSauce();
sauce.opts.platform = "VISTA";
sauce.opts.browserName = "iexplorer";
sauce.opts.version = "9";

 

This instantiates a new Jellyfish object and lets it know that this one will be talking to Sauce Labs. The username and password for your Sauce Labs account can be passed in here directly, but in this case I’m storing them in my ~/.jfrc file, which is nice for simplicity. Next we specify the platform, browser, and version we want to use to run our test (more information about .jfrc can be found here).

sauce.on('result', function(res) {  
  console.log(sauce.name + ' : '+sauce.tid + ' - \x1b[33m%s\x1b[0m', JSON.stringify(res));
});

 

This one is optional, but isn’t it nice to get well-formatted output in your logs?

sauce.on('complete', function(res) {
  console.log(sauce.name + ' : ' + sauce.tid + ' - \x1b[33m%s\x1b[0m', JSON.stringify(res));
  sauce.js("window.testOutput", function(o) {
    try {
      assert.equal(o.result.failed, 0);
      sauce.stop(function() {
        process.exit(0);
      });
    } catch (e) {
      console.log(e);
      sauce.stop(function() {
        process.exit(1);
      });
    }
  });
});

 

This listener is very important, since our communication with Selenium on Sauce OnDemand is generally one-directional. However, that has been abstracted away: during the script, Jellyfish polls Selenium every few seconds and asks “has window.jfComplete been set to true?” If so, a ‘complete’ event is triggered. In the next section you will see how the results are set; at this point, I just have Jellyfish ask the page for the contents of ‘window.testOutput’. Then we assert on the number of failures. I catch that exception so that I can print it to the terminal and exit the script with the correct return code for CI.
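The polling Jellyfish performs can be pictured as a small generic helper. A simplified, synchronous sketch of the idea (the real implementation polls Selenium asynchronously every few seconds):

```javascript
// Simplified sketch of Jellyfish's completion polling: keep asking the
// "browser" whether the completion flag is set, up to a maximum number
// of tries.
function pollUntil(check, maxAttempts) {
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    if (check()) return attempt; // the 'complete' event would fire here
  }
  return -1; // gave up; the test run never finished
}

// Fake "browser" that flips the flag on the third poll.
var polls = 0;
var browserSaysDone = function() { return ++polls >= 3; };

var result = pollUntil(browserSaysDone, 10);
console.log(result); // 3
```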

sauce.go("http://myurl.com/jquery/test/index.html")
  .js("QUnit.done = function(res) {" +
      "  window.testOutput = res;" +
      "  window.jfComplete = true;" +
      "}");

 

This is where the magic happens. This code navigates to the URL of the QUnit tests, which start automatically. As soon as we have navigated there, I redefine the QUnit done callback to set the window.jfComplete flag, and I store the output of the test run in a place the ‘complete’ listener above can access it.

Usually the first question asked here is how to access QUnit tests that are on an internal network. Fortunately, we have a product at Sauce Labs called Sauce Connect that allows our cloud’s VMs to securely access whatever your test machines can access on the internal network. It is incredibly simple to set up, and you can do so by following the comprehensive documentation.

After that tunnel is running (you can just leave it running), you can now update the start URL in ‘sauce.go’ to reflect the URL you use internally to access your tests in the browser.

Jenkins has hooks for essentially any deployment platform or source control repository, and it also has post-deploy hooks that can be configured to send you an email. You can also use either a ‘matrix project’ or environment variables to make one job run your test on all of our available platform and browser combinations.

The example script can be downloaded here from GitHub, and I’ve also made one of my Jellyfish QUnit test runs public for your viewing pleasure. Happy testing!