Guest post: Proving that an application is as broken as intended

June 25th, 2015 by Björn Kimminich

Typically you want to use end-to-end (e2e) tests to prove that everything works as intended in a realistic environment. In the Juice Shop application that idea is changed to the contrary. Here the main purpose of the e2e test suite is to prove that the application is as broken as intended!

Juice Shop: Broken beyond hope – but on purpose!

“WTF?” you might ask, and rightfully so. Juice Shop is a special kind of application. It is an intentionally insecure JavaScript web application designed to be used during security trainings, classes, workshops or awareness demos. It contains over 25 vulnerabilities that an aspiring hacker can exploit in order to fulfill challenges that are tracked on a scoreboard.

The job of the e2e test suite is twofold:

  1. It ensures that the overall functionality (e.g., logging in, placing products in the basket, submitting an order, etc.) of the application is working. This is the above-mentioned typical use case for e2e tests.
  2. It performs attacks on the application that should solve all the existing challenges. These include SQL Injection, Cross-Site Scripting (XSS) attacks, business logic error exploits and many more.

When does Juice Shop pass its e2e test suite? When it is working fine for the average nice user and all challenges are solvable, so an attacker can get 100% on the scoreboard!
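To give a flavor of what such a “challenge” spec can look like, here is a minimal Protractor-style sketch (element locators and challenge names are illustrative, not taken from the actual Juice Shop test suite) that attacks the login form with a classic SQL Injection and expects the application to be broken enough to let it through:

describe('challenge "Login Admin"', function () {
  it('should log in as the administrator via SQL Injection', function () {
    browser.get('/#/login');
    // the injected quote and comment cut the password check out of the backend query
    element(by.model('user.email')).sendKeys("' or 1=1--");
    element(by.model('user.password')).sendKeys('anything');
    element(by.id('loginButton')).click();
    // if the vulnerability is intact, the app now shows the logout button
    expect(element(by.id('logout')).isPresent()).toBe(true);
  });
});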

Repost: Angular + Protractor + Sauce Connect, Launched From Gulp, All Behind A Corporate Firewall!

March 9th, 2015 by Bill McGee

This post comes from our friend Stephen Wylie, who is using Sauce Connect to work with his corporate firewall.  Check out the original post on his blog.

You didn’t think it could be done, did you?

Well, let me prove you wrong!  First, some terms:
  • AngularJS: An MVC framework for JavaScript, allowing you to write web apps without relying on jQuery.
  • Protractor: A test harness for Angular apps.
  • Sauce Labs: A company providing cloud services to help you run E2E testing on your web app in any environment combination.
  • Node.js: A JavaScript runtime whose package manager, npm, you’ll need for installing all the dependencies to get this tool set working.
  • Gulp: A build manager and mundane-task automator.  A competitor to the older, well-entrenched Grunt, but gaining popularity by the hour.  Uses JavaScript syntax, but could theoretically be used as a Makefile or shell script replacement.

The Basic Premise

My organization is writing web apps using Angular.  Long before I joined, they selected Gulp to manage application tasks such as running the app on localhost at port 8888 for development & unit test purposes.  They also selected Protractor as a test harness to interact with the web app.  Protractor depends on the presence of Angular in order to work properly, and provides the use of Selenium WebDriver (for interacting with browsers) and chained promises (a JavaScript construct to avoid deeply nested callback functions).

 

Sauce Labs has been selected as the testing tool of choice because it saves us from having to set aside a massive amount of infrastructure to run tests on multiple platforms.  Through the configuration file for Protractor, I can specify exactly what OS platform & browser combination I want the test to run on.  Of course, being an organization such as it is, they also have a corporate firewall in place that will, under normal circumstances, prevent the VMs at Sauce Labs from accessing development & test deployments of our web apps under construction.  This is where Sauce Connect comes in: it provides a secure mechanism for the external Sauce Labs VMs to acquire the data that the server would serve to you as if you were inside the corporate firewall.  Winful for everybody!  The best part is that Sauce Labs is free for open-source projects.

Journey Through the Forest: Wiring All This Together

It is, truthfully, “stupid simple” to set up a Gulp task that will run Protractor tests through the Sauce Connect mechanism.  All you need in your Protractor configuration file is:

[code language="javascript"]exports.config = {
  sauceUser: "your login name",
  sauceKey: "the GUID provided to you on your dashboard on Sauce's site",
  specs: ["the files to run as your test"],
  // optional: the default is ondemand.saucelabs.com:80/wd/hub, but localhost:4445/wd/hub
  // is also valid for when you're running sc locally and ondemand doesn't work
  sauceSeleniumAddress: "localhost:4445/wd/hub",
  capabilities: {
    'tunnel-identifier': 'I will explain this later',
    'browserName': "enter your browser of choice here"
  }
};[/code]

(Note that the “:4445” above should be replaced by whatever port the sc binary reports, if it says anything different.) It’s so simple that you don’t even need any “require()”s in the config file. And in your Gulpfile, all you need is this:

[code language="javascript"]var gulp = require('gulp');
var protractor = require('gulp-protractor').protractor;

gulp.task('sauce-test', function() {
  gulp.src('same as your "specs" from above, for the most part (unless your working directory is different)')
    .pipe(protractor({
      configFile: 'path to the config file I described above'
    }))
    .on('error', function (e) {
      throw e;
    })
    .on('end', function() {
      // anything you want to run after the Sauce tests finish
    });
});[/code]

Then, of course, you can run your tests by typing “gulp sauce-test” on the command line from the same directory as the Gulpfile.  However, proper functioning of this configuration eluded me for a long time because I did not know the Sauce Connect binary (“sc” / “sc.exe”) was supposed to be running on my machine.  I thought the binary was running on another machine in the organization, or on ondemand.saucelabs.com, and that all I needed to do was point the Gulpfile at the remote instance of sc (with the sauceSeleniumAddress entry).  While I could point sauceSeleniumAddress at a different host, it was a flawed assumption on my part that anyone else in my organization was already running “sc”.  Nor does ondemand.saucelabs.com solve the problem by itself, because it doesn’t provide the services of “sc”.  It is most convenient to run sc on your own system.

This configuration issue stymied me so much that I actually played with Grunt and several plugins therein before realizing that running tests through Sauce Connect was even possible through JavaScript to any extent.  Ultimately, I found a Node plugin for Grunt called “grunt-mocha-webdriver” that proved to me this was possible, and even doable in Gulp with Protractor and Selenium-WebDriver like I want, as opposed to Grunt/Mocha/WD.js.  (By the way, blessings to jmreidy, since he also wrote the sauce-tunnel which is relied upon heavily in this tutorial.)

Nevertheless, the easiest way to run Sauce Connect on your own system is to install the “sauce-tunnel” package through npm, the Node Package Manager (visit https://www.npmjs.com/ for other hilarious things “npm” could stand for :-P).  This is, of course, achievable by running the following on the command line:

npm install sauce-tunnel

If sauce-tunnel is already in your node_modules directory, then good for you!  Otherwise, you could run this in any directory in which “npm” is recognized as a valid command, but you might want to place this module strategically; the best place to put it will be revealed below.  Either way, you need to traverse to the directory where sc is located; this depends on what OS you are running, as the sauce-tunnel package contains binaries for Mac OS X (Darwin), Linux 32/64-bit, and Windows.  So, run the “sc” executable for your given platform before you run the Gulp task specified above, or else Gulp will appear to time out (ETIMEDOUT) when it’s trying to get into Sauce Connect.

 

The minimum options you need for sc are your Sauce login name and your Sauce key (the GUID as specified above).  There are more options you can include, such as proxy configurations, as specified in the Sauce Connect documentation.  (Note that the tunnel-identifier, as called out in the Protractor config file, can be specified as an argument to sc.)

In simple terms, here’s what we have thus far:

[assuming you’ve set up all the Node packages]:

vendor/[platform]/bin$ sc -u <your Sauce login name> -k <your Sauce key> [-i <tunnel identifier>] [other options]

gulp-workingdir$ gulp sauce-test

This will set up “sc” for as long as your computer is hooked up to the Internet, and will run the Sauce tests on the existing tunnel.  The tunnel will remain active until you disconnect your computer from the Internet or end the sc process, but the tests running through Gulp will set up & tear down a Selenium WebDriver that’ll drive the UI on your web app.

Help!  The test did not see a new command for 90 seconds, and is timing out!!!

If you are seeing this message, you might be behind a corporate proxy that is not letting your request go straight through to the Sauce servers.  Protractor has in its “runner.js” file a section where it will pick a specific DriverProvider based on certain settings you provide in the configuration file, and by providing the “sauceUser” and “sauceKey” values, it will pick the “sauce” DriverProvider.  The sauce DriverProvider provides an “updateJob” function that communicates with Sauce Labs (via an HTTP PUT request) on the status of the job.  This function is supposed to run after the tests conclude, and if that HTTP request fails, then the Gulp task will not end properly; thus, you will see this message.  Your list of tests in your Sauce Connect dashboard will look like this:
[Screenshot: the Sauce Connect dashboard showing jobs flagged as “Error” after the 90-second timeout]
This message is so severe in Sauce that it doesn’t just show up as “Fail,” it shows up as “Error”.  It also wastes a bunch of execution time, as seen in the picture above, and will obscure the fact that all the test cases actually passed (as they did in the pictured case above).  If you see this message after it is apparent that there are no more commands to be run as part of the test, then it is probably a proxy issue which is easy to resolve.

 

Here’s how:

 

In your Protractor configuration file, add the following lines:

[code language="javascript"]var HttpsProxyAgent = require("https-proxy-agent");

var agent = new HttpsProxyAgent('http://<user>:<password>@<proxy host>:<port>');

exports.config = {
  agent: agent,
  // things you had in there before
};[/code]

Then, in your node_modules/protractor/lib/driverProviders/sauce.js file (i.e. the DriverProvider for Sauce Labs in Protractor), add this:

[code language="javascript"]this.sauceServer_ = new SauceLabs({
  username: this.config_.sauceUser,
  password: this.config_.sauceKey,
  agent: this.config_.agent // this is the line you add
});[/code]

Once you have your https-proxy-agent in place as specified, your PUT request should go through, and your tests should pass (as seen in the Sauce jobs list).

The whole process, end-to-end, running in Gulp

If it does not satisfy you to simply run the “sc” binary from the command line and then kick off a Gulp task that relies on the tunnel already existing, you can get everything to run in Gulp from end to end. To do this, you need to require sauce-tunnel in your Gulpfile (thus you might as well run npm install sauce-tunnel from the same directory where your Gulpfile exists). Then, you need to make some changes to the Gulpfile: add some additional tasks for tunnel setup & teardown, and some special provisions so these tasks are executed in series rather than in parallel.

[code language="javascript"]var gulp = require('gulp');
var protractor = require('gulp-protractor').protractor;
var SauceTunnel = require('sauce-tunnel');
var tunnel;

gulp.task('sauce-start', function(cb) {
  tunnel = new SauceTunnel("<your Sauce ID>", "<your Sauce key>", "<Sauce tunnel name — this must be specified and match the tunnel-identifier name specified in the Protractor conf file>");

  // >>>> Enhance logging – this function was adapted from that Node plugin for Grunt, which runs grunt-mocha-wd.js
  var methods = ['write', 'writeln', 'error', 'ok', 'debug'];
  methods.forEach(function (method) {
    tunnel.on('log:' + method, function (text) {
      console.log(method + ": " + text);
    });
    tunnel.on('verbose:' + method, function (text) {
      console.log(method + ": " + text);
    });
  });
  // <<<< End enhance logging

  tunnel.start(function(isCreated) {
    if (!isCreated) {
      return cb('Failed to create Sauce tunnel.');
    }
    console.log("Connected to Sauce Labs.");
    cb();
  });
});

gulp.task('sauce-end', function(cb) {
  tunnel.stop(function() {
    cb();
  });
});

gulp.task('sauce-test', ['sauce-start'], function () {
  gulp.src('<path to your Protractor spec file(s)>')
    .pipe(protractor({
      configFile: '<path to your Protractor conf file>'
    }))
    .on('error', function (e) {
      throw e;
    })
    .on('end', function() {
      console.log('Stopping the server.');
      gulp.run('sauce-end');
    });
});[/code]

Note here that the cb() function is new to Gulp, yet the “gulp.run()” construct mentioned toward the bottom of the code snippet above is actually deprecated. I will get around to fixing that once it stops working, but I think that in the grand scheme of priorities, I’d rather clean the second-story gutter with only a plastic Spork first before fixing that deprecated line. :-P
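If you do want to avoid the deprecated call, one possible approach (an untested sketch, not from the original post) is to skip gulp.run() entirely and stop the tunnel directly in the 'end' handler, since the tunnel variable is already in scope:

[code language="javascript"].on('end', function() {
  console.log('Stopping the server.');
  // stop the sauce-tunnel instance started in the 'sauce-start' task,
  // instead of invoking the deprecated gulp.run('sauce-end')
  tunnel.stop(function() {
    console.log('Sauce tunnel closed.');
  });
});[/code]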

At this point, you should be able to run a test with Sauce Connect from end to end in Gulp without any extra intervention. However, if Gulp is failing because it can’t write to a file in a temporary folder pertaining to the tunnel (whose name you picked), find a way to have it save to a different temporary location that you have access to rather than simply running gulp as root, since it’s always good to minimize running things as root.

One Brief Important Interruption about Lingering sc Instances…

If these instructions haven’t worked out 100% for you, or you are me and spent a great deal of time exploring this, you may be frustrated with how often Sauce Connect hangs around when there’s been a problem. You can’t start the Sauce Connect binary again if it’s already running, and if you try, it gives you an esoteric error message that does not make it apparent that this is what happened. To remedy this in a *nix operating system, simply run “pkill sc”, as long as you don’t have other critical processes that have “sc” in their name. In my case, the other processes with “sc” in the name are running under a different user, and I don’t have privileges to kill them (I’m not logged in as root nor running “sudo pkill sc”), so it doesn’t do anything harmful to the system.

Shutting It Down Cleanly

In order to properly shut down sc, you may have noticed one final Gulp task in the code snippet above — “sauce-end”. This task, in the background, runs an HTTP DELETE operation on saucelabs.com, and is subject to corporate proxy rules once again. To circumvent this, you can simply require https-proxy-agent in node_modules/sauce-tunnel/index.js (like we did in the Protractor configuration file), and set up the agent in the same way. In this case, you will edit the code in node_modules/sauce-tunnel/index.js as such:

[code language="javascript"]// other pre-existing requires
var HttpsProxyAgent = require("https-proxy-agent");

var agent = new HttpsProxyAgent('http://<user>:<password>@<proxy host>:<port>');

// other existing code
this.emit('verbose:debug', 'Trying to kill tunnel');
request({
  method: "DELETE",
  url: this.baseUrl + "/tunnels/" + this.id,
  json: true,
  agent: agent // this is the line you add
}, // … etc[/code]

Now, obviously, this is not sustainable if you wish to ever upgrade sauce-tunnel or wish not to include a proxy agent. For this, I will be submitting “less hacky” fixes to the respective GitHub repositories for these open-source Node modules in order to make it easier for all users in the future to use Sauce Connect with Protractor through their corporate proxies.

Nevertheless, there’s no harm in this DELETE call failing, other than it makes the Gulp task stall another minute or so, which is annoying when you’re at work late trying to learn how all this stuff works in order to finish off some polishing touches on your big project.

To recap running everything from end to end in Gulp:

[Assuming you’ve set up all your Node packages to run a Protractor script with the conf file set up for Sauce Labs, as described above]:

  • In the same directory as your Gulpfile, run:
    npm install sauce-tunnel
  • Set up your Gulpfile in the manner I described above, with the sauce-tunnel require, and the “sauce-start”, “sauce-end”, and “sauce-test” tasks, and with the “Sauce tunnel name” (3rd argument in new SauceTunnel()) set to the same value as the Protractor config file “tunnel-identifier” value. Be sure to study all the possible values that “new SauceTunnel()” takes, as you can pass in options to the sc binary if you need them.
  • If you are behind a corporate proxy or firewall, make the recommended edits to the Sauce DriverProvider at node_modules/protractor/lib/driverProviders/sauce.js, and to the sauce-tunnel module at node_modules/sauce-tunnel/index.js.
  • Run the Gulp task.
    gulp sauce-test
    or
    sudo gulp sauce-test

Once again, I plan to check in “more sustainable” and “less hacky” code to help you deal with corporate proxies in the future without making temporary workarounds to downloaded modules.

 

Have an idea for a blog post, webinar, or more? We want to hear from you! Submit topic ideas (or questions!) here.


Re-Blog: Testing JavaScript on Various Platforms with Karma and Sauce Labs

November 6th, 2014 by Bill McGee

Thanks to Ben Ripkins for this great blog post on codecentric!

See an excerpt below.

The web can be a brutal environment with various combinations of browsers and operating systems (platforms). It is quite likely that your continuous integration setup only covers a small portion of your users’ platforms. The unfortunate truth is that testing on these platforms is necessary to accommodate for compliance differences and partial support of standards and technologies like HTML, JavaScript, CSS and network protocols.

Sauce Labs is a service that can be used to test web applications by simulating user behaviour or executing JavaScript unit tests. Sauce Labs eases testing by supporting 442 different platform combinations and additionally recording the test execution. This means that you can actually see what a user might have seen and therefore trace errors easily.

For the purpose of this blog post we will cover the following setup:

  • Sources and tests written using CommonJS modules. Modules are transpiled for the web using Browserify.
  • Mocha as our test framework.
  • Local test execution in the headless PhantomJS browser.
  • Testing against the following platforms in our CI environment:
    • Chrome 35 on Windows 7,
    • Firefox 30 on an arbitrary operating system,
    • iPhone 7.1 on OS X 10.9 and
    • Internet Explorer on Windows 8.1.
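
A karma.conf.js for this kind of setup might look roughly like the sketch below (plugin choices and launcher definitions are illustrative assumptions, not the exact configuration from the original article):

// karma.conf.js – hedged sketch, assuming karma-mocha, karma-browserify,
// karma-phantomjs-launcher and karma-sauce-launcher are installed
module.exports = function (config) {
  // the Sauce Labs browsers we want to cover in CI
  var customLaunchers = {
    sl_chrome_35:  { base: 'SauceLabs', browserName: 'chrome', version: '35', platform: 'Windows 7' },
    sl_firefox_30: { base: 'SauceLabs', browserName: 'firefox', version: '30' },
    sl_iphone:     { base: 'SauceLabs', browserName: 'iphone', version: '7.1', platform: 'OS X 10.9' },
    sl_ie_11:      { base: 'SauceLabs', browserName: 'internet explorer', platform: 'Windows 8.1' }
  };

  config.set({
    frameworks: ['mocha', 'browserify'],
    files: ['test/**/*Spec.js'],
    preprocessors: { 'test/**/*Spec.js': ['browserify'] },
    reporters: ['dots', 'saucelabs'],
    customLaunchers: customLaunchers,
    // locally run headless in PhantomJS; in CI run the Sauce Labs browsers
    browsers: process.env.CI ? Object.keys(customLaunchers) : ['PhantomJS'],
    singleRun: true
  });
};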


Announcing Sauce Integration With Siesta

September 16th, 2014 by Bill McGee

Sauce is thrilled to announce that we’ve integrated with Siesta by Bryntum!

Siesta is a JavaScript unit testing tool that can help you test any JavaScript code and also perform testing of the DOM and simulate user interactions. The tool can be used together with any type of JavaScript codebase – jQuery, Ext JS, NodeJS, Dojo, YUI etc. Using the API, you can choose from many types of assertions ranging from simple logical JS object comparisons to verifying that an HTML element is visible in the DOM. It comes in two versions: Lite and Standard. With Lite you can easily test your JavaScript in the browser, and with Standard you can also automate your tests (highly recommended).
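As a rough illustration (method names follow Siesta’s public API; the add() function under test is made up), a single Siesta test file might look like this:

// hedged sketch of a Siesta test file – the code under test is hypothetical
StartTest(function (t) {
    t.diag('Sanity checks');

    t.ok(window.add, 'add() is defined');
    t.is(add(1, 2), 3, 'add() sums two numbers');
});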

Sauce is a cloud-based tool that enables you to securely test your web and mobile apps across 385+ browser/OS/platform/device combinations.

How does Siesta work with Sauce?


Recap: Fearless Browser Test Automation [WEBINAR]

August 8th, 2014 by Bill McGee

Thanks to those of you who attended our last webinar, Fearless Browser Test Automation, featuring John-David Dalton. This webinar was presented by O’Reilly and Sauce Labs, a provider of the world’s largest automation cloud for testing web and native/hybrid mobile applications.

We hope you found John-David’s perspectives helpful, and that if you’re now doing manual testing on a limited range of browsers – or no testing at all – you’re ready for the awesomeness of automated cross-browser testing.

Missed the webinar? You can watch it in its entirety HERE.

Still scared? Never fear: you can get more tips and tools at the Sauce Labs Documentation Center. If you’re new to JavaScript testing, here are some resources to get you started.

Lastly, please follow our friends at O’Reilly at @oreillymedia, Sauce Labs at @saucelabs, and John-David at @jdalton to keep up with the latest, and feel free to share this webinar using the hashtag #fearlesstesting.

Adding Custom Methods to Data Models with Angular $resource

July 24th, 2014 by Bill McGee

Sauce Labs software developer Alan Christopher Thomas and his team have been hard at work updating our stack. He shared with us some insight into their revised dev process, so we thought we’d show off what he’s done. Read his follow-up post below.

Thanks for your great feedback to this post. Previously we examined three different approaches to modeling data in AngularJS. We’ve since incorporated some of your feedback, so we wanted to share that information here. You can also see updates we made in our original post.

One of our commenters made mention of a cleaner approach to adding custom methods to $resource models when our API response allows it, using angular.extend().

In this implementation, we’re imagining an API response that looks like this:

[
  {
    "breakpointed": null,
    "browser": "android",
    "browser_short_version": "4.3",
    ...
  },
  {
    ...
  }
  ...
]

Each of the response objects in the list is a “Job” that contains a whole lot of metadata about an individual job that’s been run in the Sauce cloud.

We want to be able to iterate over the jobs to build a list for our users, showing the outcome of each: “Pass,” “Fail,” etc.

Our template looks something like this:

{{ job.getResult() }} {{ job.name }}

Note the job.getResult() call. In order to get this convenience, however, we need to be able to attach a getResult() method to each Job returned in the response.

So, here’s what the model looks like, using Angular $resource:

angular.module('job.models', [])
    .factory('Job', ['$resource', function($resource) {
        var Job = $resource('/api/jobs/:jobId', {
            full: 'true',
            jobId: '@id'
        });

        angular.extend(Job.prototype, {
            getResult: function() {
                if (this.status == 'complete') {
                    if (this.passed === null) return "Finished";
                    else if (this.passed === true) return "Pass";
                    else if (this.passed === false) return "Fail";
                }
                else return "Running";
            }
        });

        return Job;
    }]);

Note that since each resulting object returned by $resource is a Job object itself, we can simply extend Job.prototype to include the behavior we want for every individual job instance.

Then, our controller looks like this (revised from the original post to make use of the not-so-obvious promise):

angular.module('job.controllers', [])
    .controller('jobsController', ['$scope', '$http', 'Job', function($scope, $http, Job) {
        $scope.loadJobs = function() {
            $scope.isLoading = true;
            var jobs = Job.query().$promise.then(function(jobs) {
                $scope.jobs = jobs;
            });
        };

        $scope.loadJobs();
    }]);

The simplicity of this example makes $resource a much more attractive option for our team’s data-modeling needs, especially considering that for simple applications, custom behavior isn’t incredibly unwieldy to implement.

– Alan Christopher Thomas, Software Developer, Sauce Labs

AngularJS Data Models: $http VS $resource VS Restangular

July 15th, 2014 by Bill McGee

Sauce Labs software developer Alan Christopher Thomas and his team have been hard at work updating our stack. He shared with us some insight into their dev process, so we thought we’d show off what he’s done. Read his post below.

Over the past few months, the Sauce Labs web team has fixed its crosshairs on several bits of our stack that needed to be refreshed. One of those bits is the list of jobs all customers see when they first log into their account. It looks like this:

[Screenshot: the jobs list customers see when they first log into their Sauce Labs account]

Our current app is built in Backbone.js. We vetted lots of options for frontend MVC frameworks and data binding that could replace and simplify the existing Backbone code: Ember, Angular, React, Rivets, Stapes, etc.

After lots of research, building some stuff, and personal preference, our team decided we were most comfortable with Angular.

We had one more thing we wanted to verify, though, before settling.

How complicated will it be to model our data?

This was the first question on most of our minds, and it was the one question about Angular on which Google fell into a void of silence. Backbone has models and collections. Ember.js has Ember Data and ember-model. Stapes has extensible observable objects that can function as collections. But what about Angular? Most examples we found were extremely thin on the data layer, just returning simple JavaScript objects and assigning them directly to a $scope model.

So, we built a small proof of concept using three different AngularJS data modeling techniques. This is a dumbed down version of our Jobs page, which only displays a list of jobs and their results. Our only basic requirement was that we kept business logic out of our controllers so they wouldn’t become bloated.

We gave ourselves some flexibility with the API responses and allowed them to be wrapped with an object or not wrapped to emphasize the strengths of each approach. However, all calls require limit and full parameters to be passed in the GET query string.

Here’s what we wanted the resulting DOM template to look like:

{{ job.getResult() }} {{ job.name }}

Note that each resulting job should be able to have a getResult() method that displays a human-readable outcome in a badge. The rendered page looks like this:

[Screenshot: the rendered jobs list, each job shown with a result badge and its name]

The Code: $http vs $resource vs Restangular

So, here’s the resulting code for all three approaches, each implementing a getResult() method on every job.

$http

In this approach, we created a service that made the API calls and wrapped each result as a Job() object with a getResult() method defined on the prototype.

API Response Format:

{
  "meta": {}, 
  "objects": [
    {
      "breakpointed": null, 
      "browser": "android", 
      "browser_short_version": "4.3",
      ...
    },
    {
      ...
    },
    ...
  ]
}

models.js:

angular.module('job.models', [])
    .service('JobManager', ['$q', '$http', 'Job', function($q, $http, Job) {
        return {
            getAll: function(limit) {
                var deferred = $q.defer();

                $http.get('/api/jobs?limit=' + limit + '&full=true').success(function(data) {
                    var jobs = [];
                    for (var i = 0; i < data.objects.length; i ++) {
                        jobs.push(new Job(data.objects[i]));
                    }
                    deferred.resolve(jobs);
                });

                return deferred.promise;
            }
        };
    }])
    .factory('Job', function() {
        function Job(data) {
            for (attr in data) {
                if (data.hasOwnProperty(attr))
                    this[attr] = data[attr];
            }
        }

        Job.prototype.getResult = function() {
            if (this.status == 'complete') {
                if (this.passed === null) return "Finished";
                else if (this.passed === true) return "Pass";
                else if (this.passed === false) return "Fail";
            }
            else return "Running";
        };

        return Job;
    });

controllers.js:

angular.module('job.controllers', [])
    .controller('jobsController', ['$scope', 'JobManager', function($scope, JobManager) {
        var limit = 20;
        $scope.loadJobs = function() {
            JobManager.getAll(limit).then(function(jobs) {
                $scope.jobs = jobs;
                limit += 10;
            });
        };

        $scope.loadJobs();
    }]);

This approach made for a pretty simple controller, but since we needed a custom method on the model, our services and factories quickly became verbose. Also, if we were to abstract away this behavior to apply to other data types (sub-accounts, tunnels, etc.), we might end up writing a whole lot of boilerplate.

$resource

UPDATE: Per Micke’s suggestion in the comments section below, we’ve posted a follow-up with a cleaner implementation of the $resource version of the Job model. It parses an API response similar to the one shown in the Restangular scenario and allows for much cleaner method declaration using angular.extend().

Angular provides its own $resource factory, which has to be included in your project as a separate dependency. It takes away some of the pain we felt in writing our JobManager service boilerplate code and allows us to apply our custom method directly to the $resource prototype, then transform responses so that each item is wrapped in the resource itself.

API Response Format:

{
  "items": [
    {
      "breakpointed": null, 
      "browser": "android", 
      "browser_short_version": "4.3", 
      ...
    }, 
    {
      ...
    }
    ...
  ]
}

models.js:

angular.module('job.models', [])
    .factory('Job', ['$resource', function($resource) {
        var Job = $resource('/api/jobs/:jobId', { full: 'true', jobId: '@id' }, {
            query: {
                method: 'GET',
                isArray: false,
                transformResponse: function(data, header) {
                    var wrapped = angular.fromJson(data);
                    angular.forEach(wrapped.items, function(item, idx) {
                        wrapped.items[idx] = new Job(item);
                    });
                    return wrapped;
                }
            }
        });

        Job.prototype.getResult = function() {
            if (this.status == 'complete') {
                if (this.passed === null) return "Finished";
                else if (this.passed === true) return "Pass";
                else if (this.passed === false) return "Fail";
            }
            else return "Running";
        };

        return Job;
    }]);

controllers.js:

angular.module('job.controllers', [])
    .controller('jobsController', ['$scope', 'Job', function($scope, Job) {
        var limit = 20;
        $scope.loadJobs = function() {
            var jobs = Job.query({ limit: limit }, function(jobs) {
                $scope.jobs = jobs.items;
                limit += 10;
            });
        };

        $scope.loadJobs();
    }]);

This approach also makes for a pretty elegant controller, except we really didn’t like that the query() method didn’t return a promise directly, but gave us an object with the promise in a $promise attribute (thanks Louis!). It felt a little ugly. Also, the process of transforming result objects and wrapping them felt like a strange dance to achieve some simple behavior (UPDATE: see this post). We’d probably end up writing more boilerplate to abstract that part away.

Restangular

Last, but not least, we gave Restangular a shot. Restangular is a third-party library that attempts to abstract away pain points of dealing with API responses, reduce boilerplate, and do it in the most Angular-y way possible.

API Response Format:

[
  {
    "breakpointed": null, 
    "browser": "android", 
    "browser_short_version": "4.3", 
    ...
  }, 
  {
    ...
  }
  ...
]

models.js:

angular.module('job.models', [])
  .service('Job', ['Restangular', function(Restangular) {
    var Job = Restangular.service('jobs');

    Restangular.extendModel('jobs', function(model) {
      model.getResult = function() {
        if (this.status == 'complete') {
          if (this.passed === null) return "Finished";
          else if (this.passed === true) return "Pass";
          else if (this.passed === false) return "Fail";
        }
        else return "Running";
      };

      return model;
    });

    return Job;
  }]);

controllers.js:

angular.module('job.controllers', [])
  .controller('jobsController', ['$scope', 'Job', function($scope, Job) {
    var limit = 20;
    $scope.loadJobs = function() {
      Job.getList({ full: true, limit: limit }).then(function(jobs) {
        $scope.jobs = jobs;
        limit += 10;
      });
    };
    $scope.loadJobs();
  }]);

In this one, we got to cheat and use Restangular.service(), which provides all the RESTful goodies for us. It even abstracted away writing out full URLs for our API calls. Restangular.extendModel() gives us an elegant way to attach methods to each of our model results, making getResult() straightforward and readable. Lastly, the call in our controller returns a promise! This let us write the controller logic a bit more cleanly and allows us to be more flexible with the response in the future.

tldr; Concluding Thoughts

Each of the three approaches has its appropriate use cases, but I think in ours we’re leaning toward Restangular.

$http – $http is built into Angular, so there’s no need for the extra overhead of loading in an external dependency. $http is good for quick retrieval of server-side data that doesn’t really need any specific structure or complex behaviors. It’s probably best injected directly into your controllers for simplicity’s sake.

$resource – $resource is good for situations that are slightly more complex than $http. It’s good when you have pretty structured data, but you plan to do most of your crunching, relationships, and other operations on the server side before delivering the API response. $resource doesn’t let you do much once you get the data into your JavaScript app, so you should deliver it to the app in its final state and make more REST calls when you need to manipulate or look at it from a different angle. Any custom behavior on the client side will need a lot of boilerplate.

Restangular – Restangular is a perfect option for complex operations on the client side. It lets you easily attach custom behaviors and interact with your data in much the same way as other model paradigms you’ve used in the past. It’s promise-based, clean, and feature-rich. However, it might be overkill if your needs are basic, and it carries along with it any extra implications that come with bringing in additional third-party dependencies.

Restangular seems to be a decently active project with the prospect of a 2.0 that’s compatible with Angular 2.0, currently a private repository. However, a lot of the project’s progress seems to depend on the work of a single developer for the time being.

We’re looking forward to seeing how Restangular progresses and whether or not it seems like a good fit for us at Sauce! If this blog post has piqued your interest and you feel as passionate about web development as we do, feel free to check out our career opportunities here at Sauce or send us a note.

Alan Christopher Thomas, Software Developer, Sauce Labs

Re-Blog: JavaScript Multi Module Project – Continuous Integration

June 11th, 2014 by Bill McGee

Our friend Lubos Krnac describes how to integrate Sauce with Protractor in a quest to implement continuous integration in his JavaScript multi module project with Grunt.

Below is a quote from his most recent blog post, alongside some code.

Read the rest of his post to get the full how-to here.

An important part of this setup is Protractor integration with Sauce Labs. Sauce Labs provides a Selenium server with a WebDriver API for testing. Protractor uses Sauce Labs by default when you specify their credentials. Credentials are the only special configuration in test/protractor/protractorConf.js (bottom of the snippet). The other configuration was taken from the grunt-protractor-coverage example. I am using this Grunt plug-in for running Protractor tests and measuring code coverage.

// A reference configuration file.
exports.config = {
  // ----- What tests to run -----
  //
  // Spec patterns are relative to the location of this config.
  specs: [
    'test/protractor/*Spec.js'
  ],
  // ----- Capabilities to be passed to the webdriver instance ----
  //
  // For a full list of available capabilities, see the Protractor reference config
  // and the WebDriver capabilities documentation.
  capabilities: {
    'browserName': 'chrome'
    //  'browserName': 'firefox'
    //  'browserName': 'phantomjs'
  },
  params: {
  },
  // ----- More information for your tests ----
  //
  // A base URL for your application under test. Calls to protractor.get()
  // with relative paths will be prepended with this.
  baseUrl: 'http://localhost:3000/',
  // Options to be passed to Jasmine-node.
  jasmineNodeOpts: {
    showColors: true, // Use colors in the command line report.
    isVerbose: true, // List all tests in the console
    includeStackTrace: true,
    defaultTimeoutInterval: 90000
  },
  
  sauceUser: process.env.SAUCE_USERNAME,
  sauceKey: process.env.SAUCE_ACCESS_KEY
};

You may ask “how can I use localhost in the configuration, when a remote Selenium server is used for testing?” Good question. Sauce Labs provides a very useful feature called Sauce Connect. It is a tunnel that emulates access to your machine from the Selenium server. This is super useful when you need to bypass a company firewall. It will be used later in the main project CI configuration.

Have an idea for a blog post, webinar, or more? We want to hear from you! Submit topic ideas (or questions!) here.

JavaScript Unit Testing API Revamped

February 19th, 2014 by Jonah Stiennon

Sauce provides a shortcut API for running JavaScript unit tests (like Jasmine, QUnit, Mocha, and YUI Test) on all browsers using our cloud.

The old way of doing things:

Before the unit test API was added, running frontend JavaScript tests using Selenium was pretty messy. One had to point the Selenium browser at the page which ran and reported the tests, then inspect DOM elements on the page looking for the test results.

[Screenshot: DOM unit testing, reading results from Mocha’s reporter elements in a Selenium test]

See how we’re getting the number of passing tests from the element with id=”mocha-stats”?

This was pretty dependent on the styling of the page, and it can take an intensive amount of logic in a Selenium test, especially if you want to parse all the individual assertions.
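For illustration, the “old way” looked roughly like this sketch (the selectors assume Mocha’s HTML reporter and the page URL is made up; this is not code from the original post):

// scrape the pass count out of Mocha's DOM reporter via Selenium – illustrative sketch
var webdriver = require('selenium-webdriver');

var driver = new webdriver.Builder()
  .usingServer('http://ondemand.saucelabs.com:80/wd/hub')
  .withCapabilities({
    browserName: 'firefox',
    username: process.env.SAUCE_USERNAME,
    accessKey: process.env.SAUCE_ACCESS_KEY
  })
  .build();

driver.get('http://localhost:8000/test/index.html');
// #mocha-stats is the element the screenshot above points at
driver.findElement(webdriver.By.css('#mocha-stats .passes em')).getText()
  .then(function (passes) {
    console.log('passing tests: ' + passes);
    return driver.quit();
  });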

The new way of doing things:

Now you can let Sauce Labs take care of all the tedium!

Instead of setting up a WebDriver session and sending Selenium commands to our servers, just fire off a single HTTP request:

curl -X POST https://saucelabs.com/rest/v1/$SAUCE_USERNAME/js-tests \
     -u $SAUCE_USERNAME:$SAUCE_ACCESS_KEY -H 'Content-Type: application/json' \
     --data '{
        "platforms": [["Windows 7", "firefox", "20"],
                      ["Linux", "googlechrome", ""]],
        "url": "https://saucelabs.com/test_helpers/front_tests/index.html",
        "framework": "jasmine"}'

The Sauce servers point a browser at the test page and get the results. We parse the results depending on the framework you’re using and display them in a friendly manner on the Job Details page.

Failing Mocha and Qunit test reports on Sauce Labs

We now report the specific tests which fail, no more hunting through screenshots/videos.

The bad news:

Sauce doesn’t inspect DOM elements to get your test results; it’s much more robust. Buuuuuuut, it relies on you making the test results available in the global scope of the JavaScript on the page. Once you add the code appropriate for your framework, our servers gather the data, parse it, and display it.

An extra feature we get from this is support for an arbitrary “custom” unit test report. If you set `window.global_test_results` to an object that looks like this:

{
  "passed": 4,
  "failed": 0,
  "total": 4,
  "duration": 4321,
  "tests": [
    {
      "name": "foo test",
      "result": true,
      "message": "so foo",
      "duration": 4000
    },
    {
      "name": "bar test",
      "result": true,
      "message": "passed the bar exam",
      "duration": 300
    },
    {
      "name": "baz test",
      "result": true,
      "message": "passed",
      "duration": 20
    },
    {
      "name": "qux test",
      "result": true,
      "message": "past",
      "duration": 1
    }
  ]
}

We’ll display the results and report the test status automatically.
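For example, with Mocha on the page you could collect results into that shape yourself with something along these lines (a hedged sketch, not Sauce’s official snippet):

// run Mocha in the browser and expose the results where the Sauce runner can find them
var results = { passed: 0, failed: 0, total: 0, duration: 0, tests: [] };
var start = new Date().getTime();

mocha.run()
  .on('pass', function (test) {
    results.passed++;
    results.tests.push({ name: test.title, result: true, message: 'passed', duration: test.duration });
  })
  .on('fail', function (test, err) {
    results.failed++;
    results.tests.push({ name: test.title, result: false, message: err.message, duration: test.duration });
  })
  .on('end', function () {
    results.total = results.passed + results.failed;
    results.duration = new Date().getTime() - start;
    window.global_test_results = results;
  });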

[Screenshot: custom unit test results displayed on the Job Details page]

Enjoy the new reporting! If this gets enough use, we can expand support to more frameworks and see if we can inject the reporting code into test pages when we test them, lessening the work for the developer.

Automatically Test Your JavaScript Framework with BrowserSwarm

September 26th, 2013 by Ashley Wilson

Today we’re excited to announce, along with the Internet Explorer team at Microsoft and appendTo, a new tool called BrowserSwarm for automatically testing JavaScript across various browsers.

Powered on the backend by Sauce Labs’ cloud testing platform, BrowserSwarm provides an easy way for JavaScript framework authors to automatically test their projects across multiple browsers and devices with the click of a button.

BrowserSwarm connects directly to your GitHub repo, so tests are automatically kicked off and run on Sauce each time your team makes a change. Sauce has 160+ browser/OS combinations in the cloud, so developers don’t waste precious time and resources setting this up themselves.  After the tests are done, you can view the video, screenshots and raw logs to identify and debug failures faster.

Why BrowserSwarm?

With an increasingly fragmented browser and device market, it’s ever more important for developers to ensure cross-browser compatibility of their app or framework before public release. But maintaining a lab for test infrastructure is just not a reality for many open source projects relying on the work of a community to move development efforts forward.

That’s where BrowserSwarm comes in. The goal of BrowserSwarm is to help developers spend less time testing and more time innovating – while ensuring web frameworks work exactly as expected. Projects like Dojo, Modernizr and Backbone.js have signed up to test their projects with BrowserSwarm, with more on the way.

How to add your Project to BrowserSwarm

Are you a framework author wanting to test your JavaScript? Begin by setting up an account in minutes at http://browserswarm.com. Then:

  1. Tell us who you are

  2. Add your GitHub repo

  3. Wait for BrowserSwarm to complete the installation

  4. Publish your latest code

Going Forward

Along with BrowserSwarm, we recently announced JavaScript Unit Testing on Sauce. Sauce Labs also powers parts of modern.IE by automating the Compatibility Inspector tool so developers can easily test across modern versions of Internet Explorer like 11, 10, and 9 while supporting older versions. Between these various efforts, our hope is that more developers and project authors can make testing an integral part of their development efforts.

If you have feedback on how Microsoft, appendTo, and Sauce can improve BrowserSwarm, please shoot a note to let us know.

Happy testing!