Managing Windows Servers with Chef, Book Review

Harness the power of Chef to automate management of Windows-based systems using hands-on examples.

Managing Windows Servers with Chef

Recently, I had the opportunity to read ‘Managing Windows Servers with Chef’, authored by John Ewart, and published in May 2014 by Packt Publishing. At a svelte 110 pages in paperback form, ‘Managing Windows Servers with Chef’ is a quick read, packed with concise information, relevant examples, and excellent code samples. Available on Packt Publishing’s website for a mere $11.90 for the ebook, it is a worthwhile investment for anyone considering Chef Software’s Chef product for automating their Windows-based infrastructure.

As an IT professional, I use Chef for both Windows and Linux-based IT automation on a regular basis. In my experience, there is a plethora of information on the Internet about properly implementing and scaling Chef. There is seldom a topic I can’t find the answers to online. However, it has also been my experience that this information is often Linux-centric. That is one reason I really appreciated Ewart’s book, which concentrates almost exclusively on Windows-based implementations of Chef.

IT professionals, just getting started with Chef, or migrating from Puppet, will find ‘Managing Windows Servers with Chef’ invaluable. Ewart does a good job building the user’s understanding of the Chef ecosystem, before beginning to explain its application to a Windows-based environment. If you are considering Chef versus Puppet Labs’ Puppet for Windows-based IT automation, reading this book will give you a solid overview of Chef.

Seasoned users of Chef will also find ‘Managing Windows Servers with Chef’ useful. Professionals quickly master Chef’s principles, and develop the means to automate their specific tasks with Chef. But inevitably, there comes the day when they must automate something new with Chef. That is where the book can serve as a handy reference.

Of all the book’s topics, I especially found value in Chapter 5 (Managing Cloud Services with Chef) and Chapter 6 (Going Beyond the Basics – Testing Recipes). Even large enterprise-scale corporations are moving infrastructure to cloud providers. Ewart demonstrates Chef’s Windows-based integration with Microsoft’s Azure, Amazon’s EC2, and Rackspace’s Cloud offerings. Also, Ewart’s section on testing is a reminder to all of us of the importance of unit testing. I admit I more often practice TAD (‘Testing After Development’) than TDD (Test-Driven Development), LOL. Ewart introduces both RSpec and ChefSpec for testing Chef recipes.

I recommend ‘Managing Windows Servers with Chef’ for anyone considering Chef, or who is seeking a good introductory guide to getting started with Chef for Windows-based systems.

 


Data-Driven Forms with AngularJS’s Two-Way Data Binding and Custom Directives

Use the two-way data binding and custom directives features of AngularJS to develop data-driven, interactive forms.

Introduction

AngularJS has exploded onto the web-application development scene. Since being introduced in 2009, AngularJS’s use has grown exponentially. Its wide range of features and ease of use make it an ideal tool for rapidly developing modern web-applications. Combined with other modern JavaScript tools, such as Node, Express, Twitter Bootstrap, Yeoman, and NoSQL databases such as MongoDB, AngularJS developers can create robust, full-stack JavaScript applications.

A primary feature of AngularJS is two-way data binding. According to AngularJS’s website, ‘data-binding is the automatic synchronization of data between the model and view. The way that Angular implements data-binding lets you treat the model as the single-source-of-truth in your application. The view is a projection of the model at all times. When the model changes, the view reflects the change, and vice versa.‘ In the past, developers spent much of their coding time wiring up UI components to the application’s data model. AngularJS has greatly simplified this process.
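As a quick illustration (this snippet is not part of the post’s project; the module, controller, and property names below are made up), a model property defined on the scope stays synchronized with any view elements bound to it:

// Hypothetical stand-alone example of two-way data binding
angular.module('bindingDemo', [])
  .controller('GreetingController', function ($scope) {
    $scope.name = 'World'; // the model - the single source of truth
  });

<!-- Typing in the input updates $scope.name immediately, which updates the
     heading; changing $scope.name in the controller updates the input. -->
<div ng-app="bindingDemo" ng-controller="GreetingController">
  <input type="text" ng-model="name"/>
  <h2>Hello, {{name}}!</h2>
</div>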

Another key feature of AngularJS is directives. At a high level, according to AngularJS’ site, ‘directives are markers on a DOM element (such as an attribute, element name, comment or CSS class) that tell AngularJS’s HTML compiler to attach a specified behavior to that DOM element or even transform the DOM element and its children.‘ AngularJS provides many built-in directives, including ngModel, ngBind, ngInclude, ngRepeat, and ngChange. These directives are the building blocks of an AngularJS application. We will use many of these built-in directives in this post.

In addition to built-in directives, AngularJS allows us to create custom directives. Custom directives are a powerful feature, allowing us to encapsulate our own reusable DOM manipulation functionality.
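For example (a hypothetical directive, not one from this post’s project), a custom directive can encapsulate a reusable piece of markup and behavior, and then be used as if it were a new HTML element:

// Hypothetical custom directive - registered on a module like any other component
angular.module('demoModule', [])
  .directive('greetingCard', function () {
    return {
      restrict: 'E',             // usable as an element: <greeting-card>
      scope: { recipient: '@' }, // isolate scope; reads the 'recipient' attribute
      template: '<p>Season\'s greetings, {{recipient}}!</p>'
    };
  });

<!-- usage in a view -->
<greeting-card recipient="AngularJS developers"></greeting-card>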

The Sample Project

There is an infinite variety of web-based forms (‘electronic forms’). We interact with web-based forms at work, at home, and at school. Forms serve the primary purpose of collecting user data. Web-based forms allow us to order products and services over the Internet, file our taxes, manage our benefits at work, track our time, and take online classes.

Tests or quizzes are a perfect example of web-based forms with which to demonstrate AngularJS’s many strengths, including data-binding and custom directives. In this post, we will create a series of interactive quizzes on the theme of AngularJS – sort of a learning opportunity inside a learning opportunity. Quizzes often contain several common question/answer formats, including true-false, multiple-choice, multiple-correct, ordering, matching, short-answer, essay, and so forth. These question/answer formats take advantage of all the HTML form elements, including radio buttons, check-boxes, text fields, drop-down lists, list boxes, and text areas. We will build the quizzes from static JSON data files, using AngularJS’s services, controllers, routes, views, templates, directives, and custom directives.

In the first example, we will use AngularJS’s factory service, controller, partial templates, view, routing, and built-in directive features to read JSON data from a file, and display and validate a basic true-false quiz. In the second example, we will expand our true-false quiz to contain additional types of questions, including multiple-choice and multiple-correct. For the advanced quiz, we will make use of custom directives and partial view templates. These two new features will allow us to increase the quizzes’ complexity without substantially increasing the complexity of the code we need to write.

Installing and Configuring the Project

This post’s project is available on GitHub. The easiest way to obtain all the source code is to clone the project with Git. Once you have cloned the project, don’t forget to install the npm and bower packages. All commands are shown below. The minimum requirements for the project are to have Bower, Grunt, npm, and Git installed.

git clone https://github.com/garystafford/angular-quiz.git
cd angular-quiz
npm install
bower install

Alternately, if you are experienced building JavaScript applications with Yeoman’s scaffolding tool, yo, you can create a new project and recreate the code yourself. To use generator-angular’s code generators, you will need yo installed, in addition to Bower, Grunt, npm, and Git. Since this post’s project is based on Yeoman’s generator-angular, you can use npm to install it globally. Afterwards, using generator-angular’s available code generators, you can easily reproduce the post’s basic project structure.

npm install -g generator-angular

# Use generator-angular code generators to create project components
# Instructions here: https://github.com/yeoman/generator-angular
mkdir quiz-app && cd $_
yo angular quiz
yo angular:route quizAdvanced
yo angular:factory quizAdvancedFactory
yo angular:directive quizTrueFalseDirective
Using yo with generator-angular to Set-up a New Application

Using yo with generator-angular to Create New Components

If you used the generator-angular code generators to create the project yourself, using the above instructions, your module will be called ‘quizApp’. The application name, found in the ‘package.json’ and ‘bower.json’ files, will be ‘quiz’. I changed my project’s module and app names to be more descriptive, along with the names of the routes, factories, directives, and other components, so your names will vary slightly from those produced by the code generators.

Also, if you used the generator-angular code generators to create the project yourself, you may need to install a few additional npm and bower packages, not part of the generator-angular project, to reproduce this post’s project exactly.

Project Structure

The project structure follows the generator-angular format. Most core application files are kept in the ‘app’ folder. This post’s project has added the ‘app/data’ folder, which holds the quiz data, and the ‘app/scripts/partials’ folder, which holds the partial view templates for the custom directives (explained later).

Project View from WebStorm 8

Starting the Project

The project is started using the ‘grunt serve‘ command. Using the grunt server, the project is hosted on ‘localhost’, port 9000, by default. This can be changed to a specific hostname or IP address by editing the ‘Gruntfile.js’ file’s ‘connect‘ task.
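For reference, the relevant portion of the generated ‘Gruntfile.js’ looks roughly like the excerpt below (your generated file may differ slightly); editing the ‘hostname’ and ‘port’ options changes where the site is served:

// Gruntfile.js (excerpt) - approximate shape of the generator-angular 'connect' task
connect: {
  options: {
    port: 9000,
    // change 'localhost' to a specific hostname or IP address to serve externally
    hostname: 'localhost'
  }
  // 'livereload', 'test', and 'dist' targets omitted here
}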

Testing the Project

There are some basic tests created using Karma, the test runner for JavaScript. These tests are run using the ‘grunt test‘ command. Tests are set to run on port 8092, using the PhantomJS web browser. PhantomJS, if you’re not familiar, is a headless WebKit, scriptable with a JavaScript API. PhantomJS is ideal for use with continuous integration servers, such as TravisCI. If you do not have PhantomJS installed, and plan to run the tests, change the ‘browsers‘ property in the ‘karma.conf.js’ file, located in the project’s root directory. Chrome is a good alternative for local testing. Test results for this GitHub project can be reviewed on TravisCI.
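For example, the relevant settings in ‘karma.conf.js’ look roughly like the excerpt below (an approximation only; the rest of the file is unchanged); switching the browsers value to Chrome is all that is needed for local testing:

// karma.conf.js (excerpt) - approximate shape of the project's Karma configuration
module.exports = function (config) {
  config.set({
    port: 8092,
    browsers: ['PhantomJS'] // change to ['Chrome'] if PhantomJS is not installed
    // frameworks, files, and plugins settings omitted here
  });
};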

Creating a complete set of unit tests for the advanced quiz proved challenging, based on its nested, partial view templates, described in the Advanced Quiz section. I may add a more complete set of unit tests in the future.

Basic Quiz

The first quiz is a six-question, basic true-false format form. The user answers all six questions, and then pushes a button to display the results.

Basic Quiz Before User Input

Basic Quiz With User Input

The basic quiz uses a single controller (quizBasicController.js), a single factory service (quizBasicFactory.js), a single route (apps.js – ‘/quizBasic’), and a single partial view template (quiz-basic.html), in addition to the main layout (index.html). All these components are part of the ‘quizModule’ AngularJS module. I’ve attempted to illustrate these relationships in the diagram, below.
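For reference, the route definition looks roughly like the sketch below (an approximation only; the module dependencies and template path are assumptions, so see the project’s route file on GitHub for the exact code):

// Approximate sketch of the '/quizBasic' route - names and paths are assumptions
angular.module('quizModule', ['ngResource', 'ngRoute'])
  .config(function ($routeProvider) {
    $routeProvider
      .when('/quizBasic', {
        templateUrl: 'views/quiz-basic.html', // partial view template
        controller: 'QuizBasicController'
      })
      .otherwise({
        redirectTo: '/'
      });
  });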

The factory service (quizBasicFactory.js) uses $resource, a service in AngularJS’s ngResource module, to load the contents of a local JSON-format file (quiz-basic.json).

angular.module('quizModule')
 .factory('quizBasicFactory', function ($resource) {
 return $resource('./data/quiz-basic.json');
 });
{
  "name":      "Basic Quiz Example",
  "questions": [
    {
      "_id":      1,
      "question": "AngularJS is a declarative programming language.",
      "answer":   true
    },
    {
      "_id":      2,
      "question": "The acronym 'SPA' stands for Single-Page Application.",
      "answer":   true
    },
    {
      "_id":      3,
      "question": "AngularJS is written in C++.",
      "answer":   false
    }
    ...
  ]
}

The controller (quizBasicController.js) calls the factory service (quizBasicFactory.js), which returns the ‘data’ object.

angular.module('quizModule')
  .controller('QuizBasicController',
  function ($scope, quizBasicFactory) {
    var createResults;
    $scope.title = null; // quiz title
    $scope.quiz = {}; // quiz questions
    $scope.results = []; // user results

    quizBasicFactory.get(function (data) {
      $scope.title = data.name;
      $scope.quiz = data.questions;
      createResults();
    });

    // prepare array of result objects
    createResults = function () {
      var len = $scope.quiz.length;
      for (var i = 0; i < len; i++) {
        $scope.results.push({
          _id:        $scope.quiz[i]._id,
          answer:     $scope.quiz[i].answer,
          userChoice: null,
          correct:    null
        });
      }
    };

    // assign and check user's choice
    $scope.checkUserChoice = function (question, userChoice) {
      // assign the user's choice to userChoice
      $scope.results[question - 1].userChoice = userChoice;

      // check the user's choice against the answer
      if ($scope.results[question - 1].answer === userChoice) {
        $scope.results[question - 1].correct = 'Correct';
      } else {
        $scope.results[question - 1].correct = 'Incorrect';
      }
    };

    // only show results if all questions are answered
    $scope.checkQuizCompleted = function () {
      var len = $scope.results.length;
      for (var i = 0; i < len; i++) {
        if ($scope.results[i].userChoice === null) {
          return true;
        }
      }
      return false;
    };
  });
The ‘data’ Object Returned from Factory Service containing JSON Data

Contents of the ‘data’ object are used to populate the ‘$scope.quiz[]’, ‘$scope.title’, and ‘$scope.results[]’ properties. The $scope holds the quiz questions ($scope.quiz[]), the quiz title ($scope.title), and the results ($scope.results[]). The ‘$scope.checkUserChoice()’ method stores the user’s answer in the ‘$scope.results[].userChoice’ property, and evaluates whether the answer is correct ($scope.results[].correct). The ‘$scope.checkQuizCompleted()’ method checks that all questions have been answered before showing the results, when the user clicks the ‘Show Results’ button.

The $scope Containing Quiz, Title, and Results Properties

AngularJS bootstraps the application. Through AngularJS’s compiling and linking process, the partial view template (quiz-basic.html), shown below, the controller (quizBasicController.js), and the main layout (index.html) form the ‘/quizBasic’ View, which is presented to the user. Blogger Dag-Inge Aas does a nice job of explaining this process in his post, Understanding template compiling in AngularJS.

<h4 class="title">{{title}}</h4>
<br/>

<!--quiz section-->
<form name="quiz">
  <div ng-repeat="question in quiz">
    <strong>{{question._id}}. {{question.question}}</strong>

    <div class="radio">
      <input required
             name="_id{{question._id}}"
             type="radio"
             value="true"
             ng-model="question.userChoice"
             ng-change="$parent.checkUserChoice(question._id, true)"/>
      <label for="_id{{question._id}}">True</label>
      <br/>
      <input required
             name="_id{{question._id}}"
             type="radio"
             ng-value="false"
             ng-model="question.userChoice"
             ng-change="$parent.checkUserChoice(question._id, false)"/>
      <label for="_id{{question._id}}">False</label>
    </div>
  </div>
</form>

<hr/>

<!--results section-->
<div ng-init="showAnswers=true">
  <button class="btn btn-sm"
          ng-click="showAnswers=checkQuizCompleted()">
    Show Results
  </button>
  <br/>
  <br/>

  <div ng-hide="showAnswers">
    <strong>Results</strong>

    <div ng-repeat="result in results">
      {{result._id}}. <span
        ng-class="result.correct == 'Correct' ? 'correct' : 'incorrect'">
        {{result.correct}}
      </span>
    </div>
  </div>
</div>

We load all the contents of the JSON data file into $scope and use the ‘ng-repeat‘ directive to iterate over the questions ($scope.quiz[]) and the results ($scope.results[]). Because of this, modifying existing questions and adding new ones is easy. This requires no additional coding, just a change to the JSON data file.

Advanced Quiz

Using all the same basic building blocks as the basic quiz, with the addition of custom directives, we can add complexity to our quiz without a lot of additional coding. This advanced quiz has nine questions, including three true-false format, three multiple-choice format, and three multiple-correct format. As the user answers each question, they are presented with the result, either ‘Correct’ or ‘Incorrect’.

Advanced Quiz Before User Input

Advanced Quiz With User Input

Similar to the basic quiz, the advanced quiz uses a single controller (quizAdvancedController.js), factory service (quizAdvancedFactory.js), route (apps.js – ‘/quizAdvanced’), partial view template (quiz-advanced.html), and the main layout (index.html). Additionally, the advanced quiz uses a filter, three custom directives, and four partial view templates. The fourth partial view template, ‘quiz-choice-response.html’, is called by the first three partial view templates. It contains common DOM elements. Like the basic quiz, all these components are part of the ‘quizModule’ module. I’ve attempted to illustrate these relationships in the diagram, below.

Just like with the basic quiz, the factory service (quizAdvancedFactory.js) uses $resource to load the contents of a local JSON-format file (quiz-advanced.json). This time however, the JSON file contains three types of questions, each with a slightly different schema. The three different question types are shown in the code snippet below. The true-false questions have a boolean value as the answer, the multiple choice questions, an integer as an answer, and the multiple correct questions, an array of integers as an answer.

angular.module('quizModule')
  .factory('quizAdvancedFactory', function ($resource) {
    return $resource('./data/quiz-advanced.json');
  });
{
  "name":      "Advanced Quiz Example",
  "questions": [
    {
      "_id":      1,
      "question": "AngularJS is written completely in JavaScript.",
      "type":     "True-false",
      "answer":   true
    },
    {
      "_id":      4,
      "question": "What does the acronym 'MVC' stand for?",
      "type":     "Multiple choice",
      "choices":  [
        {
          "_id":    1,
          "choice": "Method, Variable, Constant"
        },
        {
          "_id":    2,
          "choice": "Module, View, Constraint"
        },
        {
          "_id":    3,
          "choice": "Model, View, Controller"
        },
        {
          "_id":    4,
          "choice": "None of the above"
        }
      ],
      "answer":   3
    },
    {
      "_id":      7,
      "question": "Which of the following are associated with AngularJS?",
      "type":     "Multiple correct",
      "choices":  [
        {
          "_id":    1,
          "choice": "Controller"
        },
        {
          "_id":    2,
          "choice": "Interface"
        },
        {
          "_id":    3,
          "choice": "Route"
        },
        {
          "_id":    4,
          "choice": "View"
        },
        {
          "_id":    5,
          "choice": "Model"
        },
        {
          "_id":    6,
          "choice": "Generator"
        },
        {
          "_id":    7,
          "choice": "Service"
        },
        {
          "_id":    8,
          "choice": "Node"
        }
      ],
      "answer":   [1, 3, 4, 5, 7]
    }
    ...
  ]
}

The controller (quizAdvancedController.js) calls the factory service (quizAdvancedFactory.js), which returns the ‘data’ object, just like in the basic quiz example.

angular.module('quizModule')
  .controller('QuizAdvancedController',
  function ($scope, quizAdvancedFactory, filterFilter) {
    var createResults;
    $scope.title = null; // quiz title
    $scope.quiz = {}; // quiz questions
    $scope.results = []; // user results

    quizAdvancedFactory.get(function (data) {
      $scope.title = data.name;
      $scope.quiz = data.questions;
      createResults();
    });

    // prepare array of result objects
    createResults = function () {
      var len = $scope.quiz.length;
      for (var i = 0; i < len; i++) {
        $scope.results.push({
          _id:        $scope.quiz[i]._id,
          answer:     $scope.quiz[i].answer,
          userChoice: null,
          correct:    null
        });
      }
    };

    // used for multiple correct type questions
    $scope.checkUserMultiCorrectChoice = function (question, userChoice) {
      // create blank array
      if ($scope.results[question - 1].userChoice === null) {
        $scope.results[question - 1].userChoice = [];
      }

      // find the choice; add it if not present, remove it if already present
      var pos = $scope.results[question - 1].userChoice.indexOf(userChoice);
      if (pos < 0) {
        $scope.results[question - 1].userChoice.push(userChoice);
      } else {
        $scope.results[question - 1].userChoice.splice(pos, 1);
      }

      // check the user's choice against the answer
      var answer = JSON.stringify($scope.quiz[question - 1].answer.sort());
      var choice = JSON.stringify($scope.results[question - 1].userChoice.sort());

      if (answer === choice) {
        $scope.results[question - 1].correct = true;
      } else {
        $scope.results[question - 1].correct = false;
      }
    };

    // used for multiple choice and true-false type questions
    $scope.checkUserChoice = function (question, userChoice) {
      // assign the user's choice to userChoice
      $scope.results[question - 1].userChoice = userChoice;

      // check the user's choice against the answer
      if ($scope.results[question - 1].answer === userChoice) {
        $scope.results[question - 1].correct = true;
      } else {
        $scope.results[question - 1].correct = false;
      }
    };

    // find a specific question
    $scope.filteredQuestion = function (questionId) {
      return filterFilter($scope.quiz, {_id: questionId});
    };
  });

For true-false and multiple-choice questions, the ‘$scope.checkUserChoice()’ method stores the user’s choice in the ‘$scope.results[].userChoice’ property. The method also evaluates whether the answer is correct, and stores that value in the ‘$scope.results[].correct’ property. The method takes two input parameters, the question id and the user’s choice.

For multiple-correct questions, the ‘$scope.checkUserMultiCorrectChoice()’ method does the same. The difference is that, for multiple-correct questions, the method stores both the answers and the user’s choices in a pair of arrays, ‘$scope.results[].answer[]’ and ‘$scope.results[].userChoice[]’. In addition to storing the user’s choices, the method removes choices the user deselects in the view.

Lastly, the ‘$scope.checkUserMultiCorrectChoice()’ method evaluates the user’s choices array against the correct answers array. In the example below, note the ‘$scope.results[6].answer[]’ array and the ‘$scope.results[6].userChoice[]’ array. They were determined to be equal by ‘$scope.checkUserMultiCorrectChoice()’, as reflected in the ‘true’ value of the ‘$scope.results[6].correct’ property.

Advanced Quiz Results for Multiple-Correct Question
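To make the comparison concrete, here is roughly what the final check evaluates for question 7, using the answer array from the JSON data above and a complete set of user choices (selected in any order):

// both arrays are sorted, then serialized and compared as strings
JSON.stringify([1, 3, 4, 5, 7].sort()); // '[1,3,4,5,7]' - the correct answer
JSON.stringify([4, 1, 7, 3, 5].sort()); // '[1,3,4,5,7]' - the user's choices
// the two strings match, so $scope.results[6].correct is set to true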

Filter

In the ‘quizAdvancedController.js’ controller, note the ‘filterFilter’ object injected into the controller’s main function. At the end of the controller, also note the ‘$scope.filteredQuestion(questionId)’ method.

angular.module('quizModule')
  .controller('QuizAdvancedController',
  function ($scope, quizAdvancedFactory, filterFilter) {
    ...
    // find a specific question
    $scope.filteredQuestion = function (questionId) {
      return filterFilter($scope.quiz, {_id: questionId});
    };
  });

The ‘$scope.filteredQuestion(questionId)’ method takes a question id as an input parameter, and returns that single question. The method actually returns a call to AngularJS’s built-in filterFilter. The filter takes two parameters: an array containing the entire set of questions (the ‘$scope.quiz’ array), and a ‘pattern object’ containing the specific id to filter on (‘{_id: questionId}’).
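For example, once ‘$scope.quiz’ has been populated from the JSON file, a call for question id 4 returns a one-element array containing only that question (a hypothetical usage; the values shown are from the data file above):

// hypothetical usage of the controller's filter method
var matches = $scope.filteredQuestion(4);
// equivalent to: filterFilter($scope.quiz, {_id: 4})
// matches => [{ _id: 4, question: "What does the acronym 'MVC' stand for?",
//               type: "Multiple choice", choices: [...], answer: 3 }]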

The filter method is called by the three question-type partial view templates, such as ‘quiz-multi-choice.html’. For example, the partial view template, ‘quiz-advanced.html’, shown below, uses the ‘quiz-multichoice’ element to call the custom directive, ‘quizMultiChoiceDirective.js’, passing it a request for question id 4.

<h4 class="title">{{title}}</h4>
<br/>
<form name="quiz">
  <!--true-false-->
  <quiz-truefalse filter-by="1"></quiz-truefalse>
  <quiz-truefalse filter-by="2"></quiz-truefalse>
  <quiz-truefalse filter-by="3"></quiz-truefalse>

  <!--multi-choice-->
  <quiz-multichoice filter-by="4"></quiz-multichoice>
  <quiz-multichoice filter-by="5"></quiz-multichoice>
  <quiz-multichoice filter-by="6"></quiz-multichoice>

  <!--multi-correct-->
  <quiz-multicorrect filter-by="7"></quiz-multicorrect>
  <quiz-multicorrect filter-by="8"></quiz-multicorrect>
  <quiz-multicorrect filter-by="9"></quiz-multicorrect>
</form>

The custom directive, ‘quizMultiChoiceDirective.js’, loads the partial view template, ‘quiz-multi-choice.html’, using the ‘templateUrl’ argument, which fetches the template via Ajax. The template uses the ‘ng-repeat‘ directive to populate its section of the advanced quiz with question id 4 (div ng-repeat="question in $parent.filteredQuestion(filterBy)"). It does so by calling filteredQuestion(4), in the ‘quizAdvancedController.js’ controller. A rough sketch of the directive itself follows the template, below.

<div ng-repeat="question in $parent.filteredQuestion(filterBy)">
  <strong>{{question._id}}. {{question.question}}</strong>
  <div class="radio" ng-repeat="choice in question.choices">
    <input
        name="_id{{question._id}}"
        type="radio"
        value="{{choice._id}}"
        ng-model="question.userChoice"
        ng-change="$parent.$parent.$parent.checkUserChoice(question._id, choice._id)"/>
    <label for="_id{{question._id}}">{{choice.choice}}</label>
  </div>
  <div ng-include src="'/scripts/partials/quiz-choice-response.html'"></div>
</div>
<br/>
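As mentioned above, here is a rough sketch of what ‘quizMultiChoiceDirective.js’ might contain (an approximation only; the restrict, scope, and templateUrl values are assumptions inferred from the templates, so see the GitHub project for the actual implementation):

// quizMultiChoiceDirective.js (approximate sketch, not the project's exact code)
angular.module('quizModule')
  .directive('quizMultichoice', function () {
    return {
      restrict: 'E',  // matches the <quiz-multichoice> element in quiz-advanced.html
      scope: {
        filterBy: '@' // isolate scope; reads the 'filter-by' attribute as a string
      },
      templateUrl: '/scripts/partials/quiz-multi-choice.html'
    };
  });

Because the directive uses an isolate scope, the template reaches back to the controller’s scope with ‘$parent’, as discussed in the Managing Scope section, below.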

The ‘quiz-multi-choice.html’ template also loads the contents of the ‘quiz-choice-response.html’ template. This template contains DOM elements common to all three question-type templates.

<div ng-if="$parent.$parent.$parent.results[question._id - 1].correct"
     class="result correct">
  <span class="glyphicon glyphicon-thumbs-up"></span>
  Correct!
</div>
<!--specify 'false' because not true (!) would include null (blank)-->
<div ng-if="$parent.$parent.$parent.results[question._id - 1].correct === false"
     class="result incorrect">
  <span class="glyphicon glyphicon-thumbs-down"></span>
  Incorrect
</div>

I have attempted to illustrate the filter in the diagram, below. I intentionally left out a few non-essential components to simplify the diagram, such as the main layout, config, route, service, other custom directives, and the JSON data file.

Using these techniques, we can easily extend the quiz, adding new answer types, such as ordering, matching, short-answer, and so forth.

Managing Scope

If you are familiar with AngularJS, you should understand how scope works. You should know there is more than one scope, and that scope is normally inherited from the parent scope. Directives such as ng-repeat, ng-switch, ng-view, and ng-include all create their own child scopes. Said better by AngularJS’s team, ‘in AngularJS, a child scope normally prototypically inherits from its parent scope. One exception to this rule is a directive that uses scope: { … } — this creates an isolate scope that does not prototypically inherit.‘ We use a number of these directives in this project. We also use ‘scope:’ within our custom directives for the advanced quiz example, which breaks the chain of inheritance.

In some of the code examples in this post, you will notice the use of ‘$parent‘, ‘$parent.$parent‘, or even ‘$parent.$parent.$parent‘, instead of simply ‘$scope‘. Sometimes, it is necessary to reach outside the current scope, to a parent’s scope (‘$parent‘), or that parent’s parent’s scope (‘$parent.$parent‘). A simple example of this: in the partial view template, ‘quiz-multi-choice.html’, we call ‘$parent.filteredQuestion(filterBy)‘. The ‘filteredQuestion(filterBy)’ method we need is in the parent of the template’s scope, so we call ‘$parent’ instead of ‘$scope’.

So how can you determine which scope contains the method or properties you are seeking? Use Batarang, the AngularJS WebInspector Extension for Chrome. Batarang adds an additional ‘AngularJS’ tab to Chrome’s Developer Tools. Previously, we used the example of question id 4 with AngularJS’s filter. Using Batarang, below, we can see question id 4 in the final View. Each question returned by the filter is contained within its own separate scope.

Question #4 in Batarang Models Tab

This example also shows how complex working with AngularJS’s scopes can be. Starting with a particular scope in Batarang, you can visually move up (parent scope) or down (child scope) within the scope hierarchy. The contents of each scope, the Model, are displayed on the right. Batarang also offers several other features, seen below, including AngularJS application performance and dependency visualization.

Links

Quiz Question Types (presentation)

Understanding Service Types (article)

Understanding Scopes (article)

Build custom directives with AngularJS (article)

Google I/O 2012 – Better Web App Development Through Tooling (YouTube video)


Cloud-based Continuous Integration and Deployment for .NET Development

Create a cloud-based, continuous integration and deployment toolchain for distributed .NET development teams, using GitHub, AppVeyor, and Microsoft Azure.

Introduction

Whether you are part of a large enterprise development environment, or a member of a small start-up, you are likely working with remote team members. You may be remote, yourself. Developers, testers, web designers, and other team members, commonly work remotely on software projects. Distributed teams, comprised of full-time staff, contractors, and third-party vendors, often work in different buildings, different cities, and even different countries.

If software is no longer strictly developed in-house, why should our software development and integration tools be located in-house? We live in a quickly evolving world of SaaS, PaaS, and IaaS. Popular SaaS development tools include Visual Studio Online, GitHub, BitBucket, Travis-CI, AppVeyor, CloudBees, JIRA, AWS, Microsoft Azure, Nodejitsu, and Heroku, to name just a few. With all these ‘cord-cutting’ tools, there is no longer a need for distributed development teams to be tethered to on-premise tooling, via VPN tunnels and Remote Desktop Connections.

There are many combinations of hosted software development and integration tools available, depending on your technology stack, team size, and budget. In this post, we will explore one such toolchain for .NET development. Using Git, GitHub, AppVeyor, and Microsoft Azure, we will continuously build, test, and deploy a multi-tier .NET solution, without ever leaving Visual Studio. This particular toolchain has strong integration between tools, and will scale to fit most development teams.

Git and GitHub
Git and GitHub are widely used in development today. Visual Studio 2013 has fully-integrated Git support and Visual Studio 2012 has supported Git via a plug-in since early last year. Git is fully compatible with Windows. Additionally, there are several third party tools available to manage Git and GitHub repositories on Windows. These include Git Bash (my favorite), Git GUI, and GitHub for Windows.

GitHub acts as a replacement for your in-house Git server. Developers commit code to their individual local Git project repositories. They then push, pull, and merge code to and from a hosted GitHub repository. For security, GitHub requires a registered username and password to push code. Data transfer between the local Git repository and GitHub is done using HTTPS with SSL certificates or SSH with public-key encryption. GitHub also offers two-factor authentication (2FA). Additionally, for those companies concerned about privacy and added security, GitHub offers private repositories. These plans range in price from $25 to $200 per month, currently.

GitHub View of Solution

AppVeyor
AppVeyor’s tagline is ‘Continuous Integration for busy developers’. AppVeyor automates building, testing and deployment of .NET applications. AppVeyor is similar to Jenkins and Hudson in terms of basic functionality, except AppVeyor is only provided as a SaaS. There are several hosted solutions in the continuous integration and delivery space similar to AppVeyor. They include CloudBees (hosted Jenkins) and Travis-CI. While CloudBees and Travis-CI work with several technology stacks, AppVeyor focuses specifically on .NET. Its closest competitor may be Microsoft’s new Visual Studio Online.

Like GitHub, AppVeyor also offers private repositories (spaces for building and testing code). Prices for private repositories currently range from $39 to $319 per month. Private repositories offer both added security and support. AppVeyor integrates nicely with several cloud-based code repositories, including GitHub, BitBucket, Visual Studio Online, and Fog Creek’s Kiln.

AppVeyor View of Latest Build of Solution

Azure
This post demonstrates continuous deployment from AppVeyor to a Windows Server 2012-based Azure VM. The VM has IIS 8.5, Web Deploy 3.5, IIS Web Management Service (WMSVC), and other components and configuration necessary to host the post’s sample Solution. AppVeyor would work just as well with Azure’s other hosting options, as well as other cloud-based hosting providers, such as AWS or Rackspace, which also support the .NET stack.

New Microsoft Azure Portal View of VM

Sample Solution

The Visual Studio Solution used for this post was originally developed as part of an earlier post, Consuming Cross-Domain WCF REST Services with jQuery using JSONP. The original Solution, from 2011, demonstrated jQuery’s AJAX capabilities to communicate with a RESTful WCF service, cross-domain, using JSONP. I have since updated and modernized the Solution for this post. The revised Solution is on a new branch (‘rev2014’) on GitHub. Major changes to the Solution include an upgrade from VS2010 to VS2013, the use of Git DVCS, NuGet package management, Web Publish Profiles, Web Essentials for bundling JS and CSS, Twitter Bootstrap, unit testing, and a lot of code refactoring.

Revised Restaurant Menu Demo Viewed on Android Tablet

The updated VS Solution contains the following four Projects:

  1. Restaurant – C# Class Library
  2. RestaurantUnitTests – Unit Test Project
  3. RestaurantWcfService – C# WCF Service Application
  4. RestaurantDemoSite – Web Site (JS/HTML5)
VS 2013 View of Solution

The Visual Studio Solution Explorer tab, here, shows all projects contained in the Solution, and the primary files and directories they contain.

As explained in the earlier post, the ‘RestaurantDemoSite’ web site makes calls to the ‘RestaurantWcfService’ WCF service. The WCF service exposes two operations, one that returns the menu (‘GetCurrentMenu’), and the other that accepts an order (‘SendOrder’). For simplicity, orders are stored in the file system as JSON files. No database is required for the Solution. All business logic is contained in the ‘Restaurant’ class library, which is referenced by the WCF service. This architecture is illustrated in this Visual Studio Assembly Dependencies Diagram.

Installing and Configuring the Solution

The README.md file in the GitHub repository contains instructions for installing and configuring this Solution. In addition, a set of PowerShell scripts, part of the Solution’s repository, makes the installation and configuration process quick and easy. The scripts handle creating the necessary file directories and environment variables, setting file access permissions, and configuring IIS websites. Make sure to change the values of the environment variables before running the scripts. For reference, below are the contents of several of the supplied scripts; you should use the scripts supplied in the repository.

# Create environment variables
[Environment]::SetEnvironmentVariable("AZURE_VM_HOSTNAME", `
  "{YOUR HOSTNAME HERE}", "User")

[Environment]::SetEnvironmentVariable("AZURE_VM_USERNAME", `
  "{YOUR USERNAME HERE}", "User")

[Environment]::SetEnvironmentVariable("AZURE_VM_PASSWORD", `
  "{YOUR PASSWORD HERE}", "User")

# Create new restaurant orders JSON file directory
$newDirectory = "c:\RestaurantOrders"

if (-not (Test-Path $newDirectory)){
  New-Item -Type directory -Path $newDirectory
}

$acl = Get-Acl $newDirectory
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
  "INTERACTIVE","Modify","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)
Set-Acl $newDirectory $acl

# Create new website directory
$newDirectory = "c:\RestaurantDemoSite"

if (-not (Test-Path $newDirectory)){
  New-Item -Type directory -Path $newDirectory
}

$acl = Get-Acl $newDirectory
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
  "IUSR","ReadAndExecute","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)
Set-Acl $newDirectory $acl

# Create new WCF service directory
$newDirectory = "c:\MenuWcfRestService"

if (-not (Test-Path $newDirectory)){
 New-Item -Type directory -Path $newDirectory
}

$acl = Get-Acl $newDirectory
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
 "IUSR","ReadAndExecute","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)

Set-Acl $newDirectory $acl
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
 "IIS_IUSRS","ReadAndExecute","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)
Set-Acl $newDirectory $acl

# Create main website in IIS
$newSite = "MenuWcfRestService"

if (-not (Test-Path IIS:\Sites\$newSite)){
  New-Website -Name $newSite -Port 9250 -PhysicalPath `
    c:\$newSite -ApplicationPool "DefaultAppPool"
}

# Create WCF service website in IIS
$newSite = "RestaurantDemoSite"

if (-not (Test-Path IIS:\Sites\$newSite)){
  New-Website -Name $newSite -Port 9255 -PhysicalPath `
    c:\$newSite -ApplicationPool "DefaultAppPool"
}

Cloud-Based Continuous Integration and Delivery

Webhooks
The first point of integration in our hosted toolchain is between GitHub and AppVeyor. In order for AppVeyor to work with GitHub, we use a Webhook. Webhooks are widely used to communicate events between systems, over HTTP. According to GitHub, ‘every GitHub repository has the option to communicate with a web server whenever the repository is pushed to. These webhooks can be used to update an external issue tracker, trigger CI builds, update a backup mirror, or even deploy to your production server.‘ Basically, we give GitHub permission to tell AppVeyor every time code is pushed to the GitHub repository. GitHub sends an HTTP POST to a specific URL, provided by AppVeyor. AppVeyor responds to the POST by cloning the GitHub repository, and building, testing, and deploying the Projects. Below is an example of a webhook for AppVeyor, in GitHub.

GitHub’s AppVeyor Webhook Configuration

Unit Tests
To help illustrate the use of AppVeyor for automated unit testing, the updated Solution contains a Unit Test Project. Every time code is committed to GitHub, AppVeyor will clone and build the Solution, followed by running the set of unit tests shown below. The project’s unit tests test the Restaurant class library (‘restaurant.dll’). The unit tests provide 100% code coverage, as shown in the Visual Studio Code Coverage Results tab, below:

Code Coverage Results for Restaurant Class Library

AppVeyor runs the Solution’s automated unit tests using VSTest.Console.exe. VSTest.Console calls the unit test Project’s assembly (‘restaurantunittests.dll’).  As shown below, the VSTest command (in light blue) runs all tests, and then displays individual test results, a results summary, and the total test execution time.

AppVeyor Running Automated Unit Tests Using VSTest.Console

VSTest.Console has several command line options, similar to MSBuild. They can be adjusted to output various levels of feedback on test results. For larger projects, you can selectively choose which pre-defined test sets to run. Test sets need to be set up in the Solution, in advance.

Configuring Azure VM
Before we publish the Solution from AppVeyor to Azure, we need to configure the VM. Again, we can use PowerShell to script most of the configuration. Most scripts are the same ones we used to configure our local environment. The README.md file in the GitHub repository contains instructions. The scripts handle creating the necessary file directories, setting file access permissions, configuring the IIS websites, creating the Web Deploy user account, and assigning it in IIS. For reference, below are the contents of several of the supplied scripts; you should use the scripts supplied in the repository.

# Create new restaurant orders JSON file directory
$newDirectory = "c:\RestaurantOrders"

if (-not (Test-Path $newDirectory)){
  New-Item -Type directory -Path $newDirectory
}

$acl = Get-Acl $newDirectory
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
  "INTERACTIVE","Modify","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)
Set-Acl $newDirectory $acl

# Create new website directory
$newDirectory = "c:\RestaurantDemoSite"

if (-not (Test-Path $newDirectory)){
  New-Item -Type directory -Path $newDirectory
}

$acl = Get-Acl $newDirectory
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
  "IUSR","ReadAndExecute","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)
Set-Acl $newDirectory $acl

# Create new WCF service directory
$newDirectory = "c:\MenuWcfRestService"

if (-not (Test-Path $newDirectory)){
 New-Item -Type directory -Path $newDirectory
}

$acl = Get-Acl $newDirectory
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
 "IUSR","ReadAndExecute","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)

Set-Acl $newDirectory $acl
$ar = New-Object System.Security.AccessControl.FileSystemAccessRule(`
 "IIS_IUSRS","ReadAndExecute","ContainerInherit, ObjectInherit", "None", "Allow")
$acl.SetAccessRule($ar)
Set-Acl $newDirectory $acl

# Create main website in IIS
$newSite = "MenuWcfRestService"

if (-not (Test-Path IIS:\Sites\$newSite)){
  New-Website -Name $newSite -Port 9250 -PhysicalPath `
    c:\$newSite -ApplicationPool "DefaultAppPool"
}

# Create WCF service website in IIS
$newSite = "RestaurantDemoSite"

if (-not (Test-Path IIS:\Sites\$newSite)){
  New-Website -Name $newSite -Port 9255 -PhysicalPath `
    c:\$newSite -ApplicationPool "DefaultAppPool"
}

# Create new local non-admin User and Group for Web Deploy

# Main variables (Change these!)
[string]$userName = "USER_NAME_HERE" # mjones
[string]$fullName = "FULL USER NAME HERE" # Mike Jones
[string]$password = "USER_PASSWORD_HERE" # pa$$w0RD!
[string]$groupName = "GROUP_NAME_HERE" # Development

# Create new local user account
[ADSI]$server = "WinNT://$Env:COMPUTERNAME"
$newUser = $server.Create("User", $userName)
$newUser.SetPassword($password)

$newUser.Put("FullName", "$fullName")
$newUser.Put("Description", "$fullName User Account")

# Assign flags to user
[int]$ADS_UF_PASSWD_CANT_CHANGE = 64
[int]$ADS_UF_DONT_EXPIRE_PASSWD = 65536
[int]$COMBINED_FLAG_VALUE = 65600

$flags = $newUser.UserFlags.value -bor $COMBINED_FLAG_VALUE
$newUser.put("userFlags", $flags)
$newUser.SetInfo()

# Create new local group
$newGroup=$server.Create("Group", $groupName)
$newGroup.Put("Description","$groupName Group")
$newGroup.SetInfo()

# Assign user to group
[string]$serverPath = $server.Path
$group = [ADSI]"$serverPath/$groupName, group"
$group.Add("$serverPath/$userName, user")

# Assign local non-admin User in IIS for Web Deploy
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Web.Management")
[Microsoft.Web.Management.Server.ManagementAuthorization]::Grant(`
  $userName, "$Env:COMPUTERNAME\MenuWcfRestService", $FALSE)
[Microsoft.Web.Management.Server.ManagementAuthorization]::Grant(`
  $userName, "$Env:COMPUTERNAME\RestaurantDemoSite", $FALSE)

Publish Profiles
The second point of integration in our toolchain is between AppVeyor and the Azure VM. We will be using Microsoft’s Web Deploy to deploy our Solution from AppVeyor to Azure.  Web Deploy integrates with the IIS Web Management Service (WMSVC) for remote deployment by non-administrators. I have already configured Web Deploy and created a non-administrative user on the Azure VM. This user’s credentials will be used for deployments. These are the credentials in the username and password environment variables we created.

To continuously deploy to Azure, we will use Web Publish Profiles with Microsoft’s Web Deploy technology. Both the website and WCF service projects contain individual profiles for local development (‘LocalMachine’), as well as deployment to Azure (‘AzureVM’). The ‘AzureVM’ profiles contain all the configuration information AppVeyor needs to connect to the Azure VM and deploy the website and WCF service.

The easiest way to create a profile is by right-clicking on the project and selecting the ‘Publish…’ and ‘Publish Web Site’ menu items. Using the Publish Web wizard, you can quickly build and validate a profile.

Publish Web Profile Tab

Each profile in the above Profile drop-down represents a ‘.pubxml’ file. The Publish Web wizard is merely a visual interface to many of the basic configurable options found in the Publish Profile’s ‘.pubxml’ file. The .pubxml profile files can be found in the Solution Explorer. For the website, profiles are in the ‘App_Data’ directory (i.e. ‘Restaurant\RestaurantDemoSite\App_Data\PublishProfiles\AzureVM.pubxml’). For the WCF service, profiles are in the ‘Properties’ directory (i.e. ‘Restaurant\RestaurantWcfService\Properties\PublishProfiles\AzureVM.pubxml’).

As an example, below are the contents of the ‘LocalMachine’ profile for the WCF service (‘LocalMachine.pubxml’). This is about as simple as a profile gets. Note that since we are deploying locally, the profile is configured to open the main page of the website in a browser after deployment; a helpful time-saver during development.

<?xml version="1.0" encoding="utf-8"?>
<!--
This file is used by the publish/package process of your Web project.
You can customize the behavior of this process by editing this MSBuild file.
In order to learn more about this please visit http://go.microsoft.com/fwlink/?LinkID=208121.
-->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <WebPublishMethod>FileSystem</WebPublishMethod>
        <LastUsedBuildConfiguration>Debug</LastUsedBuildConfiguration>
        <LastUsedPlatform>Any CPU</LastUsedPlatform>
        <SiteUrlToLaunchAfterPublish>http://localhost:9250/RestaurantService.svc/help</SiteUrlToLaunchAfterPublish>
        <LaunchSiteAfterPublish>True</LaunchSiteAfterPublish>
        <ExcludeApp_Data>True</ExcludeApp_Data>
        <publishUrl>C:\MenuWcfRestService</publishUrl>
        <DeleteExistingFiles>True</DeleteExistingFiles>
    </PropertyGroup>
</Project>

A key change we will make is to use environment variables in place of sensitive configuration values in the ‘AzureVM’ Publish Profiles. The Web Publish wizard does not allow this change. To do this, we must edit the ‘AzureVM.pubxml’ file for both the website and the WCF service. We will replace the hostname of the server where we will deploy the projects with a variable (e.g. AZURE_VM_HOSTNAME = ‘MyAzurePublicServer.net’). We will also replace the username and password used to access the deployment destination. This way, someone accessing the Solution’s source code won’t be able to obtain any sensitive information, which would give them the ability to hack your site. Note the use of the ‘AZURE_VM_HOSTNAME’ and ‘AZURE_VM_USERNAME’ environment variables, shown below.

<?xml version="1.0" encoding="utf-8"?>
<!--
This file is used by the publish/package process of your Web project.
You can customize the behavior of this process by editing this MSBuild file.
In order to learn more about this please visit http://go.microsoft.com/fwlink/?LinkID=208121.
-->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <WebPublishMethod>MSDeploy</WebPublishMethod>
        <LastUsedBuildConfiguration>AppVeyor</LastUsedBuildConfiguration>
        <LastUsedPlatform>Any CPU</LastUsedPlatform>
        <SiteUrlToLaunchAfterPublish />
        <LaunchSiteAfterPublish>False</LaunchSiteAfterPublish>
        <ExcludeApp_Data>True</ExcludeApp_Data>
        <MSDeployServiceURL>https://$(AZURE_VM_HOSTNAME):8172/msdeploy.axd</MSDeployServiceURL>
        <DeployIisAppPath>MenuWcfRestService</DeployIisAppPath>
        <RemoteSitePhysicalPath />
        <SkipExtraFilesOnServer>False</SkipExtraFilesOnServer>
        <MSDeployPublishMethod>WMSVC</MSDeployPublishMethod>
        <EnableMSDeployBackup>True</EnableMSDeployBackup>
        <UserName>$(AZURE_VM_USERNAME)</UserName>
        <_SavePWD>False</_SavePWD>
        <_DestinationType>AzureVirtualMachine</_DestinationType>
    </PropertyGroup>
</Project>

The downside of adding environment variables to the ‘AzureVM’ profiles is that the Publish Web wizard within Visual Studio will no longer allow us to deploy using those profiles. As demonstrated below, after substituting variables for actual values, the ‘Server’ and ‘User name’ values will no longer display properly. We can confirm this by trying to validate the connection, which fails. This does not indicate your environment variable values are incorrect, only that Visual Studio can no longer correctly parse the ‘AzureVM.pubxml’ file and display it properly in the IDE. No big deal…

Publish Web Connection Tab – Failed Validation

We can use the command line or PowerShell to deploy with the ‘AzureVM’ profiles. AppVeyor accepts both command line input and PowerShell for most tasks. All examples in this post and in the GitHub repository use PowerShell.

To build and deploy (publish) to Azure from the command line or PowerShell, we will use MSBuild. Below are the MSBuild commands used by AppVeyor to build our Solution, and then deploy our Solution to Azure. The first two MSBuild commands build the WCF service and the website. The second two deploy them to Azure. There are several ways you could construct these commands to successfully build and deploy this Solution; I found these commands to be the most succinct. I have split the build and deploy functions so that AppVeyor can run the automated unit tests in between. If the tests don’t pass, we don’t want to deploy the code.

# Build WCF service
# (AppVeyor config ignores website Project in Solution)
msbuild Restaurant\Restaurant.sln `
 /p:Configuration=AppVeyor /verbosity:minimal /nologo

# Build website
msbuild Restaurant\RestaurantDemoSite\website.publishproj `
 /p:Configuration=Release /verbosity:minimal /nologo

Write-Host "*** Solution builds complete."
# Deploy WCF service
# (AppVeyor config ignores website Project in Solution)
msbuild Restaurant\Restaurant.sln `
 /p:DeployOnBuild=true /p:PublishProfile=AzureVM /p:Configuration=AppVeyor `
 /p:AllowUntrustedCertificate=true /p:Password=$env:AZURE_VM_PASSWORD `
 /verbosity:minimal /nologo

# Deploy website
msbuild Restaurant\RestaurantDemoSite\website.publishproj `
 /p:DeployOnBuild=true /p:PublishProfile=AzureVM /p:Configuration=Release `
 /p:AllowUntrustedCertificate=true /p:Password=$env:AZURE_VM_PASSWORD `
 /verbosity:minimal /nologo

Write-Host "*** Solution deployments complete."

Below is the output from AppVeyor showing the WCF Service and website’s deployment to Azure. Deployment is the last step in the continuous delivery process. At this point, the Solution was already built and the automated unit tests completed, successfully.

AppVeyor Output from Deployments to Azure.

Below is the final view of the sample Solution’s WCF service and web site deployed to IIS 8.5 on the Azure VM.

Final View of IIS Sites Running on Azure VM


Single Page Web Applications, Book Review

A brief review of ‘Single Page Web Applications’, by authors Michael S. Mikowski and Josh C. Powell. Learn to build modern browser-based apps, using the latest full-stack JavaScript technologies.

Recently, I had the opportunity to review the eBook edition of ‘Single Page Web Applications‘, by authors Michael S. Mikowski and Josh C. Powell, published by Manning Publications. Most of us involved in software development are acutely aware of the recent explosion of interest in full-stack JavaScript applications, NoSQL databases, HTML5/CSS3, web-sockets, and single-page web applications (SPAs). Mikowski and Powell’s book, Single Page Web Applications, hit the market at a perfect time (released last September), with just the right mix of timely learning opportunities for the reader.

An interesting twist on many current books in this category is the authors’ lack of heavy reliance on one or more popular JavaScript libraries, such as AngularJS, Ember.js, and Backbone.js. Mikowski and Powell purposefully build a JavaScript-based SPA from the ground up, without simply plugging into a ready-made library or API. Although many readers may be heavily tied to a certain library or API, understanding how to build a SPA from the ground up is invaluable.

The first thing that struck me was the thoroughness of the book’s examples. A question many publishers ask: does the book have enough ‘real-world’ examples? Sadly, the answer is often no. Many books only offer incomplete, academic examples. They are often difficult to scale to match the complexity of modern software development. However, in this case, I felt Mikowski and Powell’s book hit a home run with its ‘real-world’ code samples. It is obvious both authors are working professionals, doing development in the ‘real world’. The book’s samples build upon one another throughout the book, effectively expanding the application’s scope and the reader’s knowledge.

The second attribute that stood out to me was the book’s documentation. In fact, that might have been one of the very few minor negatives I found with the book: too many comments. The authors go to great lengths to thoroughly comment and document the code samples, in some examples almost obscuring the code itself. Still, I found the comments both detailed and helpful.

The third attribute that stood out to me was the authors’ focus on testing. Testing the sample applications is highlighted throughout the book. Additionally, Appendix B, ‘Testing a SPA’, had more information on testing complex JavaScript applications than many other books I have read. Testing software is often ignored in books and training materials. However, software testing is an integral part of the ‘real-world’ software development life-cycle. Testing is critical to software’s success.

Lastly, I found a lot of value in Appendix A, ‘JavaScript coding standard‘. Read this part first! Anyone can follow along with the book, mimicking code samples, without really understanding JavaScript’s core concepts. Without a real understanding, it is hard to apply the book’s lessons to your own application. I felt the JavaScript overview in Appendix A of Mikowski and Powell’s book was one of the best I have read. I will be referring back to the appendix’s coding style guide in the future.


Windows PowerShell 4.0 for .NET Developers, Book Review

A brief review of ‘Windows PowerShell 4.0 for .NET Developers’, a fast-paced PowerShell guide, enabling you to efficiently administer and maintain your development environment.

Windows PowerShell 4.0 for .NET Developers

Introduction

Recently, I had the opportunity to review ‘Windows PowerShell 4.0 for .NET Developers‘, published by Packt Publishing. According to its author, Sherif Talaat, the book is ‘a fast-paced PowerShell guide, enabling you to efficiently administer and maintain your development environment.‘ Working in a large and complex software development organization, technologies such as PowerShell, which enable increased speed and automation, are essential to our success. Having used PowerShell on a regular basis as a .NET developer for the past few years, I was excited to see what Sherif’s newest book offered.

Requirements

The book recommends the following minimal software configuration to work through the code samples:

  • Windows Server 2012 R2 (includes PowerShell 4.0 and .NET 4.5)
  • SQL Server 2012
  • Visual Studio 2012/2013
  • Visual Studio Team Foundation Server (TFS) 2012/2013

To test the book’s samples, I provisioned a fresh VM and, using my MSDN subscription, installed the required Windows Server, SQL Server, and Team Foundation Server. I worked directly on the VM, as well as remotely from a Windows 7 Enterprise-based development machine with Visual Studio 2012 installed. The code samples worked fairly well, with only a few minor problems. No errata had been published for the book at the time of this review.

A key aspect many authors do not address is the complexity of using PowerShell in a corporate environment. Working individually or on a small network, developers don’t always experience the added burden of restrictive network security, LDAP, proxy servers, proxy authentication, XML gateways, firewalls, and centralized computer administration. Any code that requires access to remote servers and systems often needs additional work to function within a corporate environment. It can be frustrating to debug and extend simple examples to work successfully within an enterprise setting.

Contents

Windows PowerShell 4.0 for .NET Developers, at 115 pages in length, is divided into five chapters:

  • Chapter 1: Getting Started with Windows PowerShell
  • Chapter 2: Unleashing Your Development Skills with PowerShell
  • Chapter 3: PowerShell for Your Daily Administration Tasks
  • Chapter 4: PowerShell and Web Technologies
  • Chapter 5: PowerShell and Team Foundation Server

Chapter 1 provides a brief introduction to PowerShell. At a scant 30 pages, I would not recommend this book as a way to learn PowerShell for the beginner. For learning PowerShell, I recommend Instant Windows PowerShell, by Vinith Menon, also published by Packt Publishing. Alternatively, I recommend a few books by Manning Publications, including Learn Windows PowerShell in a Month of Lunches, Second Edition.

Chapter 2 discusses PowerShell in relationship to several key Microsoft technologies, including Windows Management Instrumentation (WMI), Common Information Model (CIM), Component Object Model (COM), and Extensible Markup Language (XML). As a .NET developer, it’s almost impossible not to have worked with one or all of these technologies. Chapter 2 also discusses how PowerShell works with .NET objects and extends the .NET Framework. The chapter includes an easy-to-follow example of creating, importing, and calling a PowerShell binary module (a compiled .NET class library), using Visual Studio.

Chapter 3 explores areas where a .NET developer can start leveraging PowerShell for daily administrative tasks. I found the sections on PowerShell Remoting and on administering IIS and SQL Server especially useful. Being able to easily connect to remote web, application, and database servers from the command line (or, rather, the PowerShell prompt) and perform basic system administration is a huge time savings in an agile development environment.

Chapter 4 focuses on how PowerShell interfaces with SOAP- and REST-based services, web requests, and JSON. Windows Communication Foundation (WCF) based service-oriented application development has been a trend for the last few years. Being able to manage, test, and monitor SOAP and RESTful services and HTTP requests/responses is important to .NET developers. PowerShell can often be quicker and easier than writing and compiling service utilities in Visual Studio, or using proprietary third-party applications.

Chapter 5 is dedicated to Visual Studio Team Foundation Server (TFS), Microsoft’s end-to-end Application Lifecycle Management (ALM) solution. Chapter 5 details the installation and use of the TFS Power Tools and the TFS PowerShell snap-in. Having held the roles of lead developer and Scrum Master, I have personally found some of the best uses for PowerShell in automating various aspects of TFS. Managing TFS often requires repetitive tasks, an area where PowerShell excels. You will need to explore additional resources beyond the scope of this book to really start automating TFS with PowerShell.

Conclusion

Overall, I enjoyed the book and felt it was well worth the time to explore. I applaud Sherif for targeting a PowerShell book specifically to developers. Due to its short length, the book did leave me wanting more information on a few subjects that were barely skimmed. I also found myself expecting guidance on a few subjects the book did not touch upon, such as PowerShell for cloud-based development (Azure), test automation, and build and deployment automation. For more information on some of those subjects, I recommend Sherif’s other book, also published by Packt Publishing, PowerShell 3.0 Advanced Administration Handbook.


Retrieving and Displaying Data with AngularJS and the MEAN Stack: Part II

Explore various methods of retrieving and displaying data using AngularJS and the MEAN Stack.

Mobile View on Android Smartphone

Introduction

In this two-part post, we are exploring methods of retrieving and displaying data using AngularJS and the MEAN Stack. The post’s corresponding GitHub project, ‘meanstack-data-samples‘, is based on William Lepinski’s ‘generator-meanstack‘, which in turn is based on Yeoman’s ‘generator-angular‘. As a bonus, since both projects are based on ‘generator-angular’, all the code generators work. Generators can save a lot of time and aggravation when building AngularJS components.

In part one of this post, we installed and configured the ‘meanstack-data-samples’ project from GitHub. In part two, we will look at five examples of retrieving and displaying data using AngularJS:

  • Function within AngularJS Controller returns array of strings.
  • AngularJS Service returns an array of simple object literals to the controller.
  • AngularJS Factory returns the contents of JSON file to the controller.
  • AngularJS Factory returns the contents of JSON file to the controller using a resource object
    (In GitHub project, but not discussed in this post).
  • AngularJS Factory returns a collection of documents from MongoDB Database to the controller.
  • AngularJS Factory returns results from Google’s RESTful Web Search API to the controller.

Project Structure

For brevity, I have tried to limit the number of files in the project. There are two main views, both driven by a single controller. The primary files, specific to data retrieval and display, are as follows:

  • Default site file (./)
    • index.html – loads all CSS and JavaScript files, and views
  • App and Routes (./app/scripts/)
    • app.js – instantiates app and defines routes (route/view/controller relationship)
  • Views (./app/views/)
    • data-bootstrap.html – uses Twitter Bootstrap
    • data-no-bootstrap.html – basically the same page, without Twitter Bootstrap
  • Controllers (./app/scripts/controllers/)
    • DataController.js (DataController) – single controller used by both views
  • Services and Factories (./app/scripts/services/)
    • meanService.js (meanService) – service returns array of object literals to DataController
    • jsonFactory.js (jsonFactory) – factory returns contents of JSON file
    • jsonFactoryResource.js (jsonFactoryResource) – factory returns contents of JSON file using resource object (new)
    • mongoFactory.js (mongoFactory) – factory returns MongoDB collection of documents
    • googleFactory.js (googleFactory) – factory calls the Google Web Search API
  • Models (./app/models/)
    • Components.js – mongoose constructor for the Component schema definition
  • Routes (./app/)
    • routes.js – mongoose RESTful routes
  • Data (./app/data/)
    • otherStuff.json – static JSON file loaded by jsonFactory
  • Environment Configuration (./config/environments/)
    • index.js – defines all environment configurations
    • test.js – Configuration specific to the current ‘test’ environment
  • Unit Tests (./test/spec/…)
    • Various files – all controller and services/factories unit test files are in here…

Project in JetBrains WebStorm 8.0

There are many more files critical to the project’s functionality, including app.js, Gruntfile.js, bower.json, package.json, server.js, karma.conf.js, and so forth. You should understand each of these files’ purposes.
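For context before walking through the examples, all of the services and factories listed above are injected into the single DataController. The following is only a minimal sketch of the controller’s structure, assuming standard AngularJS dependency injection; it is not copied from the GitHub project, and the comments are illustrative.

// Minimal sketch (assumed, not verbatim from the project): DataController
// injects $scope along with each of the services and factories listed above.
angular.module('generatorMeanstackApp')
  .controller('DataController', function ($scope, meanService, jsonFactory,
                                           mongoFactory, googleFactory) {
    // each example below attaches its results to a property of $scope,
    // for example $scope.meanStuff, $scope.otherStuff, $scope.mongoStuff
  });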

Function Returns Array

In the first example, we have the yeomanStuff() method, a member of the $scope object, within the DataController. The yeomanStuff() method returns an array object containing three strings. In JavaScript, a method is a function associated with an object.

$scope.yeomanStuff = function () {
  return [
    'yo',
    'Grunt',
    'Bower'
  ];
};

‘yeomanStuff’ Method of the ‘$scope’ Object

The yeomanStuff() method is called from within the view by Angular’s ng-repeat directive. The directive, ng-repeat, allows us to loop through the array of strings and add them to an unordered list. We will use ng-repeat for all the examples in this post.

<ul class="list-group">
  <li class="list-group-item"
	  ng-repeat="stuff in yeomanStuff()">
	{{stuff}}
  </li>
</ul>


Although this first example is easy to implement, it is somewhat impractical. Generally, you would not embed static data in your code. Doing so limits your ability to change the data independent of the application’s code. In addition, the function is tightly coupled to the controller, limiting its reuse.

Service Returns Array

In the second example, we also use data embedded in our code. However, this time we have improved the architecture slightly by moving the data to an Angular Service. The meanService contains the getMeanStuff() function, which returns an array containing four object literals. Using a service, we can call the getMeanStuff() function from anywhere in our project.

angular.module('generatorMeanstackApp')
  .service('meanService', function () {
    this.getMeanStuff = function () {
      return ([
        {
          component: 'MongoDB',
          url: 'http://www.mongodb.org'
        },
        {
          component: 'Express',
          url: 'http://expressjs.com'
        },
        {
          component: 'AngularJS',
          url: 'http://angularjs.org'
        },
        {
          component: 'Node.js',
          url: 'http://nodejs.org'
        }
      ])
    };
  });

Within the DataController, we assign the array object, returned from the meanService.getMeanStuff() function, to the meanStuff object property of the  $scope object.

$scope.meanStuff = {};
try {
  $scope.meanStuff = meanService.getMeanStuff();
} catch (error) {
  console.error(error);
}

‘meanStuff’ Property of the ‘$scope’ Object

The meanStuff object property is accessed from within the view, using ng-repeat. Each object in the array contains two properties, component and url. We display the property values on the page using Angular’s double curly brace expression notation (i.e. ‘{{stuff.component}}‘).

<ul class="nav nav-pills nav-stacked">
  <li ng-repeat="stuff in meanStuff">
    <a href="{{stuff.url}}"
       target="_blank">{{stuff.component}}</a>
  </li>
</ul>


Promises, Promises…

The remaining methods implement an asynchronous (non-blocking) programming model, using the $http and $q services of Angular’s ng module. These services implement the asynchronous Promise and Deferred APIs. According to Chris Webb, in his excellent two-part post, Promise & Deferred objects in JavaScript: Theory and Semantics, a promise represents a value that is not yet known and a deferred represents work that is not yet finished. I strongly recommend reading Chris’ post before continuing. I also highly recommend watching RED Ape EDU’s YouTube video, Deferred and Promise objects in Angular js. This video really clarified the promise and deferred concepts for me.
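Before looking at the factories, here is a small, generic sketch of how a deferred and its promise relate, using Angular’s $q service. The factory name, values, and use of $timeout below are purely illustrative and are not part of the project.

// A generic illustration (not from the project): a producer creates a deferred,
// does asynchronous work, and hands back only the promise; a consumer registers
// success and error callbacks on that promise with then().
angular.module('generatorMeanstackApp')
  .factory('promiseDemoFactory', function ($q, $timeout) {
    return {
      getValueSomeday: function () {
        var deferred = $q.defer();        // the deferred: work not yet finished

        $timeout(function () {            // stand-in for any asynchronous task
          deferred.resolve('the value');  // fulfill the promise on success
          // deferred.reject('the error') would signal failure instead
        }, 1000);

        return deferred.promise;          // the promise: value not yet known
      }
    };
  });

// Consumer (for example, within a controller that injects promiseDemoFactory):
promiseDemoFactory.getValueSomeday().then(
  function (value) { console.log(value); },
  function (error) { console.error(error); }
);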

Factory Loads JSON File

In the third example, we will read data from a JSON file (‘./app/data/otherStuff.json‘) using an AngularJS Factory. The differences between a service and a factory can be confusing, and are beyond the scope of this post. Here are two great links on the differences, one on Angular’s site and one on StackOverflow.

{
  "components": [
    {
      "component": "jQuery",
      "url": "http://jquery.com"
    },
    {
      "component": "Jade",
      "url": "http://jade-lang.com"
    },
    {
      "component": "JSHint",
      "url": "http://www.jshint.com"
    },
    {
      "component": "Karma",
      "url": "http://karma-runner.github.io"
    },
    ...
  ]
}

The jsonFactory contains the getOtherStuff() function. This function uses $http.get() to read the JSON file and returns a promise of the response object. According to Angular’s site, “since the returned value of calling the $http function is a promise, you can also use the then method to register callbacks, and these callbacks will receive a single argument – an object representing the response. A response status code between 200 and 299 is considered a success status and will result in the success callback being called.” As I mentioned, a complete explanation of deferreds and promises is too complex for this short post.

angular.module('generatorMeanstackApp')
  .factory('jsonFactory', function ($q, $http) {
    return {
      getOtherStuff: function () {
        var deferred = $q.defer(),
          httpPromise = $http.get('data/otherStuff.json');

        httpPromise.then(function (response) {
          deferred.resolve(response);
        }, function (error) {
          console.error(error);
        });

        return deferred.promise;
      }
    };
  });

The response object contains the data property. Angular defines the response object’s data property as a string or object, containing the response body transformed with the transform functions. One of the properties of the data property is the components array containing the seven objects. Within the DataController, if the promise is resolved successfully, the callback function assigns the contents of the components array to the otherStuff property of the $scope object.

$scope.otherStuff = {};
jsonFactory.getOtherStuff()
  .then(function (response) {
    $scope.otherStuff = response.data.components;
  }, function (error) {
    console.error(error);
  });

‘otherStuff’ Property of the ‘$scope’ Object

The otherStuff property is accessed from the view, using ng-repeat, which displays individual values, exactly like the previous methods.

<ul class="nav nav-pills nav-stacked">
  <li ng-repeat="stuff in otherStuff">
    <a href="{{stuff.url}}"
       target="_blank">{{stuff.component}}</a>
  </li>
</ul>


This method of reading a JSON file is often used for configuration files. Static configuration data is stored in a JSON file, external to the actual code. This way, the configuration can be modified without requiring the main code to be recompiled and deployed. It is a technique used by the components within this very project. Take for example the bower.json files and the package.json files. Both contain configuration data, stored as JSON, used by Bower and npm to perform package management.

Factory Retrieves Data from MongoDB

In the fourth example, we will read data from a MongoDB database. There are a few more moving parts in this example than in the previous examples. Below are the documents in the components collection of the meanstack-test MongoDB database, which we will retrieve and display with this method.  The meanstack-test database is defined in the test.js environments file (discussed in part one).

‘meanstack-test’ Database’s ‘components’ Collection Documents

To connect to MongoDB, we will use Mongoose. According to their website, “Mongoose provides a straight-forward, schema-based solution to modeling your application data and includes built-in type casting, validation, query building, business logic hooks and more, out of the box.” But wait, MongoDB is schemaless? It is. However, Mongoose provides a schema-based API for us to work within. Again, according to Mongoose’s website, “Everything in Mongoose starts with a Schema. Each schema maps to a MongoDB collection and defines the shape of the documents within that collection.”

In our example, we create the componentSchema schema, and pass it to the Component model (the ‘M’ in MVC). The componentSchema maps to the database’s components collection.

var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var componentSchema = new Schema({
  component: String,
  url: String
});

module.exports = mongoose.model('Component', componentSchema);

The routes.js file associates routes (Request URIs) and HTTP methods to Mongoose actions. These actions are usually CRUD operations. In our simple example, we have a single route, ‘/api/components‘, associated with an HTTP GET method. When an HTTP GET request is made to the ‘/api/components‘ request URI, Mongoose calls the Model.find() function, ‘Component.find()‘, with a callback function parameter. The Component.find() function returns all documents in the components collection.

var Component = require('./models/component');

module.exports = function (app) {
  app.get('/api/components', function (req, res) {
    Component.find(function (err, components) {
      if (err)
        return res.send(err); // return on error to avoid sending two responses

      res.json(components);
    });
  });
};

You can test these routes directly. Below are the results of calling the ‘/api/components‘ route in Chrome.

Response from MongoDB Using Mongoose
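If you cannot see the screenshot, the response is simply a JSON array of the collection’s documents. Assuming the componentSchema shown above, each document would look something like the following; the _id and __v fields are added automatically by MongoDB and Mongoose, and all of the values here are illustrative.

[
  { "_id": "53c8...e1", "component": "MongoDB",   "url": "http://www.mongodb.org", "__v": 0 },
  { "_id": "53c8...e2", "component": "Express",   "url": "http://expressjs.com",   "__v": 0 },
  { "_id": "53c8...e3", "component": "AngularJS", "url": "http://angularjs.org",   "__v": 0 },
  { "_id": "53c8...e4", "component": "Node.js",   "url": "http://nodejs.org",      "__v": 0 }
]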

The mongoFactory contains the getMongoStuff() function. This function uses $http.get() to call the ‘/api/components‘ route. The route is resolved by the routes.js file, which in turn executes the Component.find() command. The getMongoStuff() function returns a promise of an array of objects. Each object represents a document in the components collection.

angular.module('generatorMeanstackApp')
  .factory('mongoFactory', function ($q, $http) {
    return {
      getMongoStuff: function () {
        var deferred = $q.defer(),
          httpPromise = $http.get('/api/components');

        httpPromise.success(function (components) {
          deferred.resolve(components);
        })
          .error(function (error) {
            console.error('Error: ' + error);
          });

        return deferred.promise;
      }
    };
  });

Within the DataController, if the promise is resolved successfully, the callback function assigns the array of objects, representing the documents in the collection, to the mongoStuff property of the $scope object.

$scope.mongoStuff = {};
mongoFactory.getMongoStuff()
  .then(function (components) {
    $scope.mongoStuff = components;
  }, function (error) {
    console.error(error);
  });

‘mongoStuff’ Property of the ‘$scope’ Object

The mongoStuff property is accessed from the view, using ng-repeat, which displays individual values using Angular expressions, exactly like the previous methods.

<ul class="list-group">
  <li class="list-group-item" ng-repeat="stuff in mongoStuff">
    <b>{{stuff.component}}</b>
    <div class="text-muted">{{stuff.description}}</div>
  </li>
</ul>


Factory Calls Google Search

In the last example, we will call the Google Web Search API from an AngularJS Factory. The Google Web Search API exposes a simple RESTful interface. According to Google, “in all cases, the method supported is GET and the response format is a JSON encoded result set with embedded status codes.” Google describes this method of using RESTful access to the API, as “for Flash developers, and those developers that have a need to access the Web Search API from other Non-JavaScript environment.” However, we will access it in our JavaScript-based MEAN stack application, due to the API’s ease of implementation.

Note, according to Google’s site, “the Google Web Search API has been officially deprecated…it will continue to work…but the number of requests…will be limited. Therefore, we encourage you to move to Custom Search, which provides an alternative solution.” Google Search, or more specifically the Custom Search JSON/Atom API, is a newer API, but the Web Search API is easier to demonstrate in this brief post than the Custom Search JSON/Atom API, which requires the use of an API key.

The googleFactory contains the getSearchResults() function. This function uses $http.jsonp() to call the Google Web Search API’s RESTful interface and returns a promise of the JSONP-formatted (‘JSON with padding’) response. JSONP provides cross-domain access to a JSON payload by wrapping the payload in a JavaScript function call (callback).

angular.module('generatorMeanstackApp')
  .factory('googleFactory', function ($q, $http) {
    return {
      getSearchResults: function () {
        var deferred = $q.defer(),
          host = 'https://ajax.googleapis.com/ajax/services/search/web',
          args = {
            'version': '1.0',
            'searchTerm': 'mean%20stack',
            'results': '8',
            'callback': 'JSON_CALLBACK'
          },
          params = ('?v=' + args.version + '&q=' + args.searchTerm + '&rsz=' +
            args.results + '&callback=' + args.callback),
          httpPromise = $http.jsonp(host + params);

        httpPromise.then(function (response) {
          deferred.resolve(response);
        }, function (error) {
          console.error(error);
        });

        return deferred.promise;
      }
    };
  });

The getSearchResults() function uses the HTTP GET method to make an HTTP request to the following RESTful URI:
https://ajax.googleapis.com/ajax/services/search/web?v=1.0&q=mean%20stack&rsz=8&callback=angular.callbacks._0

Using Google Chrome’s Developer tools, we can preview the Google Web Search JSONP-format HTTP response (abridged). Note the callback function that wraps the JSON payload.

Google Web Search Results in Chrome Browser
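For readers who cannot see the screenshot, the general shape of the JSONP response is sketched below. The responseData.results and responseStatus structure follows Google’s documented response format, but the result values are abridged and illustrative.

// abridged, illustrative JSONP response from the Google Web Search API;
// the payload is passed to the callback Angular generated for the request
angular.callbacks._0({
  "responseData": {
    "results": [
      {
        "unescapedUrl": "http://en.wikipedia.org/wiki/MEAN",
        "visibleUrl": "en.wikipedia.org",
        "titleNoFormatting": "MEAN - Wikipedia, the free encyclopedia"
      }
      // ...seven more results
    ]
  },
  "responseStatus": 200
});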

Within the DataController, if the promise is resolved successfully, the callback function receives the response object. The response object contains a lot of information. We limit the amount of information sent to the view by assigning only the actual search results, an array of eight objects contained in the response object, to the googleStuff property of the $scope object.

$scope.googleStuff = {};
googleFactory.getSearchResults()
  .then(function (response) {
    $scope.googleStuff = response.data.responseData.results;
  }, function (error) {
    console.error(error);
  });

Below is the full response returned by the googleFactory. Note the path to the data we are interested in: ‘response.data.responseData.results‘.

Google Search Response Object

Below are the filtered results assigned to the googleStuff property:

‘googleStuff’ Property of the ‘$scope’ Object

The googleStuff property is accessed from the view, using ng-repeat, which displays individual values using Angular expressions, exactly like the previous methods.

<ul class="list-group">
  <li class="list-group-item"
      ng-repeat="stuff in googleStuff">
    <a href="{{stuff.unescapedUrl}}"
       target="_blank"><b>{{stuff.visibleUrl}}</b></a>

    <div class="text-muted">{{stuff.titleNoFormatting}}</div>
  </li>
</ul>



Retrieving and Displaying Data with AngularJS and the MEAN Stack: Part I

Explore various methods of retrieving and displaying data using AngularJS and the MEAN Stack.

Mobile View of Application on Android Smartphone

Introduction

In the following two-part post, we will explore several methods of retrieving and displaying data using AngularJS and the MEAN Stack. The post’s corresponding GitHub project, ‘meanstack-data-samples‘, is based on William Lepinski’s ‘generator-meanstack‘, which in turn is based on Yeoman’s ‘generator-angular‘. As a bonus, since both projects are based on ‘generator-angular’, all the code generators work. Generators can save a lot of time and aggravation when building AngularJS components.

In part one of this post, we will install and configure the ‘meanstack-data-samples’ project from GitHub. In part two, we will look at several methods for retrieving and displaying data using AngularJS:

  • Function within AngularJS Controller returns array of strings.
  • AngularJS Service returns an array of simple object literals to the controller.
  • AngularJS Factory returns the contents of JSON file to the controller.
  • AngularJS Factory returns the contents of JSON file to the controller using a resource object
    (In GitHub project, but not discussed in this post).
  • AngularJS Factory returns a collection of documents from MongoDB Database to the controller.
  • AngularJS Factory returns results from Google’s RESTful Web Search API to the controller.

Preparation

If you need help setting up your development machine to work with the MEAN stack, refer to my last post, Installing and Configuring the MEAN Stack, Yeoman, and Associated Tooling on Windows. You will need to install all the MEAN and Yeoman components.

For this post, I am using JetBrains’ new WebStorm 8RC to build and demonstrate the project. There are several good IDEs for building modern web applications; WebStorm is one of the current favorites of developers.

Complexity of Modern Web Applications

Building modern web applications using the MEAN stack or comparable technologies is complex. The ‘meanstack-data-samples’ project, and the projects it is based on, ‘generator-meanstack’ and ‘generator-angular’, have dozens of moving parts. In this simple project, we have MongoDB, ExpressJS, AngularJS, Node.js, yo, Grunt, Bower, Git, jQuery, Twitter Bootstrap, Karma, JSHint, Mongoose, and hundreds of other components, all working together. There are almost fifty Node packages and hundreds of their dependencies loaded by npm, in addition to another dozen loaded by Bower.

Installing, configuring, and managing all the parts of a modern web application requires a basic working knowledge of these technologies. Understanding how Bower and npm install and manage packages, how Grunt builds, tests, and serves the application with ExpressJS, how Yo scaffolds applications, how Karma and Jasmine run unit tests, or how Mongoose and MongoDB work together, are all essential. This brief post will primarily focus on retrieving and displaying data, not necessarily how the components all work, or work together.

Installing and Configuring the Project

Environment Variables

To start, we need to create (3) environment variables. The NODE_ENV environment variable is used to determine the environment our application is operating within. The NODE_ENV variable determines which configuration file in the project is read by the application when it starts. The configuration files contain variables, specific to that environment. There are (4) configuration files included in the project. They are ‘development’, ‘test’, ‘production’, and ‘travis’ (travis-ci.org). The NODE_ENV variable is referenced extensively throughout the project. If the NODE_ENV variable is not set, the application will default to ‘development‘.

For this post, set the NODE_ENV variable to ‘test‘. The value, ‘test‘, corresponds to the ‘test‘ configuration file (‘meanstack-data-samples\config\environments\test.js‘), shown below.

// set up =====================================
var express          = require('express');
var bodyParser       = require('body-parser');
var errorHandler     = require('errorhandler');
var favicon          = require('serve-favicon');
var logger           = require('morgan');
var cookieParser     = require('cookie-parser');
var methodOverride   = require('method-override');
var session          = require('express-session');
var path             = require('path');
var env              = process.env.NODE_ENV || 'development';

module.exports = function (app) {
    if ('test' == env) {
        console.log('environment = test');
        app.use(function staticsPlaceholder(req, res, next) {
            return next();
        });
        app.set('db', 'mongodb://localhost/meanstack-test');
        app.set('port', process.env.PORT || 3000);
        app.set('views', path.join(app.directory, '/app'));
        app.engine('html', require('ejs').renderFile);
        app.set('view engine', 'html');
        app.use(favicon('./app/favicon.ico'));
        app.use(logger('dev'));
        app.use(bodyParser());
        app.use(methodOverride());
        app.use(cookieParser('your secret here'));
        app.use(session());

        app.use(function middlewarePlaceholder(req, res, next) {
            return next();
        });

        app.use(errorHandler());
    }
};

The second environment variable is PORT. The application starts on the port indicated by the PORT variable, for example, ‘localhost:3000′. If the PORT variable is not set, the application will default to port ‘3000‘, as specified in each of the environment configuration files and the ‘Gruntfile.js’ Grunt configuration file.

Lastly, the CHROME_BIN environment variable is used by Karma, the test runner for JavaScript, to determine the correct path to the browser’s binary file. Details of this variable are discussed on Karma’s site. In my case, the value for CHROME_BIN is 'C:\Program Files (x86)\Google\Chrome\Application\chrome.exe'. This variable is only necessary if you will be configuring Karma to use Chrome to run the tests. The browser can be changed to any browser, including PhantomJS. See the discussion at the end of this post regarding browser choice for Karma.

You can easily set all the environment variables on Windows from a command prompt, with the following commands. Remember to exit and re-open your interactive shell or command prompt window after adding the variables so they can be used.
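For example, something like the following should work; adjust the CHROME_BIN path to match your own Chrome installation. Note that setx persists the variables for future sessions, which is why the command prompt must be re-opened before they take effect.

setx NODE_ENV "test"
setx PORT "3000"
setx CHROME_BIN "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"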

Install and Configure the Project

To install and configure the project, we start by cloning the ‘meanstack-data-samples‘ project from GitHub. We then use npm and bower to install the project’s dependencies. Once installed, we create and populate the Mongo database. We then use Grunt and Karma to unit test the project. Finally, we will use Grunt to start the Express Server and run the application. This is all accomplished with only a few individual commands. Please note, the ‘npm install’ command could take several minutes to complete, depending on your network speed; the project has many direct and indirect Node.js dependencies.
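The exact commands and repository URL are documented in the project on GitHub. A representative sequence looks something like the following; the repository URL is a placeholder, and the MongoDB database is created and populated using the script or instructions supplied with the project.

git clone https://github.com/<your-account>/meanstack-data-samples.git
cd meanstack-data-samples
npm install
bower install
grunt test
grunt server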

If everything was installed correctly, running the ‘grunt test’ command should result in output similar to below:

Results of Running ‘grunt test’ with Chrome

If everything was installed correctly, running the ‘grunt server’ command should result in output similar to below:

Results of Running ‘grunt server’ to Start Application

Running the ‘grunt server’ command should start the application and open your browser to the default view, as shown below:

Displaying the Application’s Google Search Results on Desktop Browser

Karma’s Browser Choice for Unit Tests

The GitHub project is currently configured to use Chrome for running Karma’s unit tests in the ‘development’ and ‘test’ environments. For the ‘travis’ environment, it uses PhantomJS. If you do not have Chrome installed on your machine, the ‘grunt test’ task will fail during the ‘karma:unit’ task portion. To change Karma’s browser preference, simply change the ‘testBrowser’ variable in the ‘./karma.conf.js’ file, as shown below.
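The relevant portion of the file looks something like this abridged sketch; only the browser setting is shown, and the remaining configuration is assumed.

// ./karma.conf.js (abridged) – change testBrowser to 'PhantomJS' to run headlessly
var testBrowser = 'Chrome';

module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],
    browsers: [testBrowser]
    // ...remaining Karma settings omitted
  });
};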

I recommend installing and using the PhantomJS headless WebKit browser, locally. Since PhantomJS is headless, Karma runs the unit tests without having to open and close browser windows. To run this project on continuous integration servers, like Jenkins or Travis-CI, you must use PhantomJS. If you decide to use PhantomJS on Windows, don’t forget to add the PhantomJS executable directory path to your ‘PATH’ environment variable, after downloading and installing the application.

 

Code Generator

As I mentioned at the start of this post, this project was based on William Lepinski’s ‘generator-meanstack‘, which in turn is based on Yeoman’s ‘generator-angular‘. Optionally, to install the ‘generator-meanstack’ npm package globally on your system, use the following command. The ‘generator-meanstack’ code generator will allow us to generate additional AngularJS components automatically within the project, if we choose. The ‘generator-meanstack’ is not required for this post.

npm install -g generator-meanstack

 

Part II

In part two of this post, we will explore each method of retrieving and displaying data using AngularJS, in detail.

