Angular CLI 6.1.0 is out (in fact we even have a 6.1.1 available)!

It is less feature-rich than the previous releases: most of the work in this one consists of refactoring and bug fixes.

If you want to upgrade to 6.1.1 without pain (or to any other version, by the way), I have created a GitHub project to help: angular-cli-diff. Choose the version you’re currently using (6.0.0 for example) and the target version (6.1.1 for example), and it gives you a diff of all the files created by the CLI: angular-cli-diff/compare/6.0.0…6.1.1. You have no excuse for staying behind anymore!

Let’s see what we’ve got!

Internal refactoring

Even if that’s not super useful to you as a developer, the devkit project (upon which the CLI relies a lot internally) is now in the same repository as the angular-cli project.

It used to be slightly painful to open issues and contribute code, because it was hard to figure out which repository the issue/code belonged to.

The angular/devkit repository has been archived, and imported back into the angular/angular-cli repository, which is now the only source of truth.

ES2015 modules everywhere

If you check angular-cli-diff/compare/6.0.0…6.1.1, you’ll see that one of the changes is that "module": "es2015" is now used in all tsconfig.json files. It means that we now have the same behaviour when serving/building/testing the app.

Vendor source map

A new option called vendorSourceMap has been introduced, allowing you to have source maps for vendor packages as well. You can use it with:

ng build --prod --source-map --vendor-source-map

This can be useful for debugging your production packages and seeing what is really included, thanks to source-map-explorer.

For example, this is with sourceMap only:

Source maps

and the same source maps built with vendorSourceMap:

Vendor source maps

That’s all for this small release, except for the support of TypeScript 2.8 and 2.9, and of course the support of Angular 6.1. You can check out what’s new in Angular 6.1 in our previous blog post.

All our materials (ebook, online training and training) are up-to-date with these changes if you want to learn more!

Cédric Exbrayat

Become a ninja with Angular

Cover of ebook Become a ninja with Angular

Pay what you want and support charity!



17-19/09 in Lyon
22-24/10 in Paris

Angular avancé

20-21/09 in Lyon
25-26/10 in Paris


Angular 6.1.0 is here!

Angular logo

keyvalue pipe

Angular 6.1 introduced a new pipe! It allows iterating over a Map or an object, and displaying the keys/values in our templates.

Note that it orders the keys:

  • first lexicographically if they are both strings
  • then by their value if they are both numbers
  • then by their boolean value if they are both booleans (false before true).

And if the keys have different types, they will be cast to strings and then compared.
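To make these rules concrete, here is a plain TypeScript sketch of the default ordering (my own re-implementation for illustration, not Angular’s actual code):

```typescript
// Illustration only: a re-implementation of the keyvalue pipe's default
// ordering rules, not Angular's actual source.
function defaultCompare(a: unknown, b: unknown): number {
  if (a === b) return 0;
  // null and undefined keys end up last
  if (a == null) return 1;
  if (b == null) return -1;
  if (typeof a === 'string' && typeof b === 'string') {
    return a < b ? -1 : 1; // lexicographic
  }
  if (typeof a === 'number' && typeof b === 'number') {
    return a - b; // numeric
  }
  if (typeof a === 'boolean' && typeof b === 'boolean') {
    return a === false ? -1 : 1; // false before true
  }
  // different types: cast to strings, then compare
  const sa = String(a);
  const sb = String(b);
  return sa < sb ? -1 : 1;
}

console.log([103, 56].sort(defaultCompare)); // → [56, 103]
```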

@Component({
  selector: 'ns-ponies',
  template: `
    <ul>
      <!-- entry contains { key: number, value: PonyModel } -->
      <li *ngFor="let entry of ponies | keyvalue">
        {{ entry.key }} - {{ entry.value.name }}
      </li>
    </ul>`
})
export class PoniesComponent {
  ponies = new Map<number, PonyModel>();

  constructor() {
    this.ponies.set(103, { name: 'Rainbow Dash' });
    this.ponies.set(56, { name: 'Pinkie Pie' });
  }
}
If you have null or undefined keys, they will be displayed at the end.

It’s also possible to define your own comparator function:

@Component({
  selector: 'ns-ponies',
  template: `
    <ul>
      <!-- entry contains { key: PonyModel, value: number } -->
      <li *ngFor="let entry of poniesWithScore | keyvalue:ponyComparator">
        {{ entry.key.name }} - {{ entry.value }}
      </li>
    </ul>`
})
export class PoniesComponent {

  poniesWithScore = new Map<PonyModel, number>();

  constructor() {
    this.poniesWithScore.set({ name: 'Rainbow Dash' }, 430);
    this.poniesWithScore.set({ name: 'Pinkie Pie' }, 125);
  }

  /**
   * Defines a custom comparator to order the elements by the name of the PonyModel (the key)
   */
  ponyComparator(a: KeyValue<PonyModel, number>, b: KeyValue<PonyModel, number>) {
    if (a.key.name === b.key.name) {
      return 0;
    }
    return a.key.name < b.key.name ? -1 : 1;
  }
}

TypeScript 2.9 support

Angular 6.0 was stuck with TS 2.7, but Angular 6.1 catches up and adds support for TS 2.8 and 2.9.

You can check out what these new versions bring on the Microsoft blog.

Shadow DOM v1 support

As you may know, Angular offers an encapsulation option that allows scoping CSS styles to their component, and their component only.

Until 6.1, Angular had three available options for this encapsulation option:

  • Emulated, which is the default one
  • Native, which relies on Shadow DOM v0
  • None, which means you don’t want encapsulation

Angular 6.1 introduces a new option: ShadowDom, which relies on Shadow DOM v1, the latest version of the specification. In theory, it should replace the Native option (as the Shadow DOM v0 specification is now deprecated), but that would be a breaking change, so the team decided to introduce a brand new option instead.

If you’re into it, you can check out this awesome blog post listing the differences between Shadow DOM v0 and Shadow DOM v1. You can see the current support from the major browsers here. The support for Shadow DOM v1 will be better than for Shadow DOM v0 in the near future, as more browser vendors feel this is the right way to go.

Angular abstracts away all the nitty-gritty details: you just have one option to switch to use Shadow DOM v1, and that’s pretty cool.
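Concretely, switching is a one-line change in the component metadata (a sketch with a hypothetical component):

```
import { Component, ViewEncapsulation } from '@angular/core';

@Component({
  selector: 'ns-pony',
  templateUrl: './pony.component.html',
  styleUrls: ['./pony.component.css'],
  // use Shadow DOM v1 instead of the default emulated encapsulation
  encapsulation: ViewEncapsulation.ShadowDom
})
export class PonyComponent {
}
```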

This new support also allows Angular Elements to be used with slot elements for basic native content projection.

Tree-shakeable services in core

You may remember that Angular 6.0 introduced tree-shakeable services, with the possibility to declare a service using @Injectable({ providedIn: 'root' }). The core services of the framework are starting to move to this new declaration, with the first two services: Title (which allows setting the title of the page) and Meta (which allows setting the metadata of the page).

It means that if you are not using them in your application, they will no longer end up in your final bundle, saving a few bytes of JavaScript sent to your users.
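As a reminder, a tree-shakeable declaration looks like this (a sketch with a hypothetical service; the framework now uses the same mechanism for Title and Meta):

```
import { Injectable } from '@angular/core';

// Hypothetical service: declared this way, it only ends up in the bundle
// if it is actually injected somewhere.
@Injectable({ providedIn: 'root' })
export class PonyService {
}
```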

Router scrolling position restoration

The router received some love in this release with the addition of a few features. The first one is an option that restores the scroll position when you navigate back to a component.

You simply have to add the option to your RouterModule configuration:

imports: [
  RouterModule.forRoot(routes, {
    scrollPositionRestoration: 'enabled'
  })
]

Three different values can be passed to this option:

  • disabled, which does nothing (default).
  • top, which sets the scroll position to [0,0].
  • enabled, which sets the scroll position to the stored position.

The enabled option will be the default in the future. With this option, the router stores the scroll position when navigating forward, and restores it when navigating back. When navigating forward, the scroll position will be set to [0, 0], or to the anchor if one is provided.

It also adds an anchorScrolling option, to configure whether the router should scroll to the element when the URL has a fragment. It has two possible values:

  • disabled, which does nothing (default).
  • enabled, which scrolls to the element. This option will be the default in the future.

And there is also a scrollOffset option, if you want to add an offset to the scrolling. It accepts a position, or a function returning a position.
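These scrolling options can be combined in the router configuration; here is a sketch (the 64px offset is an arbitrary example):

```
RouterModule.forRoot(routes, {
  scrollPositionRestoration: 'enabled',
  anchorScrolling: 'enabled',
  // arbitrary example: offset the scrolling by 64px vertically,
  // e.g. to account for a fixed header
  scrollOffset: [0, 64]
})
```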

The router now also emits a new event called Scroll that you can listen to.

On paper, this looks super handy: if you have a very long template in a component, when a user navigates back to it, she will end up on her last scrolling position.

I say “on paper”, because in reality this only works with static content! If you have dynamic content displayed in the template (let’s say a very long list that you fetch from the server), the router will attempt to scroll even before the content is inserted… So it won’t scroll to the correct position, because this position will not exist when the router tries to scroll to it.

If you are in a case like this, you’ll have to write tedious code to trigger the scroll yourself in the component, using the new ViewportScroller service (offered by the @angular/common package).

You might think that if the data is loaded via a resolver, the router would handle it correctly: since the data is loaded before the component is displayed, it would make sense for the router to scroll to the right position in that case.

But sadly, currently, no… We opened an issue right away with this feedback (you can add a thumbs up if you agree), but it is not addressed in 6.1.0.

So if you have dynamic content, you’ll have to handle the scroll yourself, by writing tedious code looking like this, even if the data comes from a resolver:

export class PendingRacesComponent {
  scrollPosition: [number, number];
  races: Array<RaceModel>;

  constructor(route: ActivatedRoute, private router: Router, private viewportScroller: ViewportScroller) {
    this.races = route.snapshot.data['races'];
    this.router.events.pipe(
      filter(e => e instanceof Scroll)
    ).subscribe(e => {
      if ((e as Scroll).position) {
        this.scrollPosition = (e as Scroll).position;
      } else {
        this.scrollPosition = [0, 0];
      }
    });
  }

  ngAfterViewInit() {
    this.viewportScroller.scrollToPosition(this.scrollPosition);
  }
}

And you’ll have to do the same in every component where you want the scroll position to be restored…

Router — URI error handler

You may have noticed that if a user tries to access a badly formed URL in your Angular application, the router will redirect to the root of the application.

Angular 6.1 introduces a new function called malformedUriErrorHandler that you can provide to redirect your user to a different page.

imports: [
  RouterModule.forRoot(routes, {
    malformedUriErrorHandler:
      // redirects the user to `/invalid-uri`
      (error: URIError, urlSerializer: UrlSerializer, url: string) => urlSerializer.parse('/invalid-uri')
  })
]

As you can see, the handler receives the badly formed URL and the error, so you can even display a proper error to your users if you want.

Router — URL update strategy

In the same vein, if the router navigates to a component, and the navigation fails, the URL is currently not updated.

A new option, urlUpdateStrategy, has been introduced; it can receive either deferred or eager. deferred is the default and only updates the URL if the navigation succeeds, as is currently the case. eager will start by updating the URL and then navigate to the component, so the URL will be updated even if the navigation fails.
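For example (a configuration sketch):

```
RouterModule.forRoot(routes, {
  // update the URL at the beginning of the navigation,
  // even if the navigation later fails
  urlUpdateStrategy: 'eager'
})
```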

Angular CLI 6.1

The CLI has also been released in 6.1.0: check out our other article about what’s new!

All our materials (ebook, online training and training) are up-to-date with these changes if you want to learn more!

Cédric Exbrayat



ngx-valdemort logo

We recently introduced ngx-speculoos, which reduces boilerplate in Angular unit tests. Check it out if you missed it.

Another place where a lot of boilerplate is needed is forms, and especially in validation error messages. Here’s an example of such boilerplate:

<div class="invalid-feedback" *ngIf="form.get('email').invalid && (f.submitted || form.get('email').touched)">
  <div *ngIf="form.get('email').hasError('required')">
    The email is required
  </div>
  <div *ngIf="form.get('email').hasError('email')">
    The email must be a valid email address
  </div>
</div>

That is just for two error messages, on one field of one form.

When you do that for all fields of all your forms, you end up with a lot of duplication of the same logic, and a high risk of misspelling control names.

Developers also end up copying and pasting these snippets, and tend to forget to rename the field name or error types in one or two places, introducing bugs.

Adding a new validation rule on a field means that a new error message must also be added.

Wouldn’t it be nice to be able to replace that mess with something like this?

<val-errors controlName="email" label="The email"></val-errors>

That’s what ngx-valdemort allows. And much more. You can override a default message with a custom one when needed. You can choose whether you want one or all error messages. You can configure when to display error messages, in a central place, to ensure consistency in all your forms.

Learn more and see it in action on our project page.

It’s free and open-source. Tell us if you like it. Also tell us if you don’t: we could improve it. The project is on GitHub.

JB Nizet



ngx-speculoos logo

Writing Angular unit tests for components quickly leads to quite a lot of boilerplate, and if you’re not careful, code duplication and not type-safe code, too. Especially when dealing with forms.

Out of the frustration from this non-ideal code, we decided to write a small library to help with these issues, and to rely on the page object pattern when it makes sense.

Let me thus introduce ngx-speculoos.

It’s free, as in beer, and as in speech.

It uses the standard Angular TestBed and ComponentFixture abstractions, so you should be up to speed in a few minutes.

So, if you’re like us, and would like your tests to be cleaner, more readable, and easier to maintain, please give it a try and tell us what you think about it.

Since a code snippet is worth a thousand words, here’s how you would test that selecting a country in a select box makes an error message disappear, and another cities select box appear, containing expected option values, labels and selection. Note the absence of calls to detectChanges or dispatchEvent. Note the non-duplication of CSS selectors thanks to the page object pattern. And note the (optional) usage of some custom matchers.

    expect(tester.countryErrors).toContainText('The country is mandatory');

    // citySelect is a getter of our page object (names assumed for this sketch)
    expect(tester.citySelect.optionValues).toEqual(['PARIS', 'LYON']);
    expect(tester.citySelect.optionLabels).toEqual(['Paris', 'Lyon']);

For more information, see our README and API documentation.

The project is on GitHub, so don’t hesitate to star it if you like it, and to request features, improvements or bug fixes, or even to contribute.

What’s that name?

speculoos cookies

Well, ngx stands for Angular extension.

Oh, you meant the other part of the name?

A speculoos is a delicious cookie from Belgium, where one quarter of the Ninja Squad team (i.e. me) comes from.

And speculoos starts with spec, which is how test files are usually named in an Angular project. That sounded like a cool name for this library.

JB Nizet



Sometimes you don’t want a full Angular app. Sometimes you just want to build a widget. Or maybe you have several teams, some using React or Vue, and others using Angular. Right now, it’s not really easy to integrate just one Angular component into an app that is not an Angular app.

Angular Labs

But some people are fighting for a better Web and think that a new standard can save us all: Web Components. Web Components actually comprise 4 different specifications:

  • HTML templates (the template tag)
  • Shadow DOM (view encapsulation)
  • HTML Imports (more or less a dead specification)
  • and the one we are interested in: Custom Elements

Note that it is already possible to use a Web Component in an Angular app, and it works seamlessly. But we had no way of exposing our Angular Components as standard Custom Elements, to use them outside of an Angular app.

Custom Elements give us the ability to declare an element, which is not a standard HTML element, but a… custom one. Like admin-user, or responsive-image, or funky-carousel.

I took a deep dive into the official specification to learn a bit more about the details of Custom Elements. You can of course build your own Custom Element with vanilla JavaScript, but there is a bit of “plumbing” to do (you have to write an ES6 class with a constructor that follows some rules, then observe the attributes that can change, then implement the correct lifecycle methods defined in the specification).

That is why Angular 6 introduces @angular/elements! Angular Elements are classic components packaged as Custom Elements.

When you package an Angular Component as an Angular Element, you can then use it like a standard Custom Element. It will bootstrap itself, and create an NgElement (custom element) that hosts the component. It also builds a bridge between the standard DOM APIs and the underlying Angular Component, by doing the plumbing between the component’s inputs and the custom element properties, between its outputs and the custom element events, and between its attributes.

To use it, build a component as usual:

@Component({
  selector: 'ns-pony',
  template: `<p (click)="onClick()">{{ ponyName }}</p>`
})
export class PonyComponent {
  @Input() ponyName;
  @Output() selected = new EventEmitter<boolean>();

  onClick() {
    this.selected.emit(true);
  }
}
Add it to a module (here PonyModule), and then you can register it in another (non-Angular) application to use it as a Custom Element:

import { createCustomElement } from '@angular/elements';
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';

import { PonyComponent, PonyModule } from './pony.module';

platformBrowserDynamic().bootstrapModule(PonyModule)
  .then(({ injector }) => {
    // get the ES6 class
    const PonyElement = createCustomElement(PonyComponent, { injector });
    // use it to register the custom element
    window.customElements.define('ns-pony', PonyElement);
  });

Once that’s done, you can use the ns-pony element as if it were a standard element:

<ns-pony pony-name="Rainbow Dash"></ns-pony>

Note that the attribute is in kebab-case, whereas the property is in camelCase.
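This attribute/property mapping can be illustrated with a tiny helper (written for this example, not an Angular API):

```typescript
// Illustration only: how a kebab-case attribute name corresponds to a
// camelCase property name (helper written for this example, not Angular API).
function kebabToCamel(attribute: string): string {
  return attribute.replace(/-([a-z])/g, (_match, letter: string) => letter.toUpperCase());
}

console.log(kebabToCamel('pony-name')); // 'ponyName'
```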

The element can be updated with your favorite framework supporting Custom Elements (like VueJS or Preact, but not (yet) React; see Custom Elements Everywhere). Or you can of course use vanilla JS:

const ponyComponent = document.querySelector('ns-pony');

// update the pony's name
setTimeout(() => ponyComponent.ponyName = 'Pinkie Pie', 3000);

// listen to the custom event
ponyComponent.addEventListener('selected', event => console.log('selected!', event));

You can even create new components and insert them: they will automatically be upgraded to custom elements (and the inner PonyComponent will be instantiated)!

const PonyComponent = customElements.get('ns-pony');
const otherPony = new PonyComponent();
otherPony.ponyName = 'Applejack';

The API is still very young (it has been in Angular Labs for the past 6 months), so I would not recommend using it in production yet. But that time will come!

Check out our ebook, online training (Pro Pack) and training if you want to learn more!

Cédric Exbrayat



Kotlin logo

Cyril and Agnès already told you about our side project for Globe 42, a local non-profit organization which helps old migrants in Saint-Etienne. We started this project as a traditional Spring Boot Java backend, with an Angular frontend. The backend uses Spring MVC to expose RESTful endpoints, and uses JPA to access a PostgreSQL database. The whole application is built with Gradle.

We finally decided to migrate it to Kotlin: it’s a nice medium-sized project to learn more about the language, and, having already played with Kotlin and liked it a lot, there was no reason not to do it.

We also decided to migrate the Gradle scripts to the Kotlin DSL. If that’s what you’re interested in, you can jump to the last section of this article.

Here’s how it went. If you’re interested in the end result, the code is on GitHub.


I did everything by myself, in a single pass: everything (except the Gradle build) was migrated in a single (but long) day. Small refinements and adjustments were brought in later.

Beware though. Globe42 is a relatively small project (around 10,000 lines of Java code, not counting comments and blank lines, but including tests). For a larger project, it would probably be wiser to split such a migration into smaller pieces. But anyway, the strategy I adopted can probably be used.

This migration was also made easy by the fact that I already had the idea of migrating to Kotlin when the project was started, so the code was written following most of the best practices that make a migration easy and the code Kotlin-friendly: constructor injection, immutable DTOs, etc.

I began without a real plan, taking the first package arbitrarily, and migrating it. This wasn’t a good idea: the package depended on other downstream packages (still written in Java). This means that the migrated code still used a lot of platform types, although I knew that these platform types would disappear later during the migration. I would thus have to come back to the migrated code later, once its downstream dependencies would have been migrated. It became clear that the good strategy was to start with the lowest layers of the code (entities, DTOs), then go up through the layers (DAOs, then services and controllers, and finally tests).

The migration was in fact largely automated by IntelliJ, which has a nice Convert Java file to Kotlin action. Note that, despite its name, this action can actually be executed on several files, or a whole directory at once.

Migration issues

The converter does a pretty good job, but isn’t perfect, and can’t read your mind. Here are some of the things I had to change manually.


The code uses static final fields as values of annotation attributes:

    private static final String PERSON_GENERATOR = "PersonGenerator";
    @SequenceGenerator(name = PERSON_GENERATOR, sequenceName = "PERSON_SEQ")

The converter transforms the constant to a field of the companion object of the class:

    companion object {
        private val PERSON_GENERATOR = "PersonGenerator"
    }

    @SequenceGenerator(
        name = PERSON_GENERATOR,
        sequenceName = "PERSON_SEQ"
    )

But only a const val can be used as the value of an annotation attribute, so that code doesn’t compile: the constant has to be changed to const val PERSON_GENERATOR = "PersonGenerator".

Field injections

The production code actually has one field injection:

    private EntityManager em;

The converter converts it to

    private val em: EntityManager? = null

This is technically correct. But not semantically correct. The code will never see em as null. It’s just that it will be initialized right after construction by Spring. So the semantically correct code is

    private lateinit var em: EntityManager

Another place where there are lots of field injections is in tests (using the @Mock and @InjectMocks annotations of Mockito, and the @MockBean annotation of Spring). Once again, all of those have to be changed manually to lateinit var instead of nullable properties.

Not null fields of JPA entities

JPA entities work basically the same way as field-injected Spring beans. When reading an entity from the database, JPA constructs it by using the no-arg constructor, and then populates its fields. This means that, although a person always has a gender in the database, the gender field of the entity is initially null, and then populated by JPA.

So, the following code

    private Gender gender;

is converted to

    var gender: Gender? = null

Once again, this is technically correct, but not semantically correct. The gender is not supposed to be null. It’s just always supposed to be initialized after construction.

Note that even in the case of the creation of a new person (where our own code invokes the constructor and populates the entity), the gender is supposed to be set, either directly in the constructor, or right after construction, before it’s ever being read.

So I decided to change this code to

    lateinit var gender: Gender

It should now be clearer why starting with the downstream layers of the code is a better idea. Having a Gender rather than a Gender? here allows upstream layers to rely on this non-null type, and thus makes the code simpler, more idiomatic, and easier to convert (since the Java code already made that assumption that the gender could never be null).

Entity IDs

We use Long for most of our entity IDs. And they are all auto-generated by JPA. Once again, this means that technically, the ID is nullable, but that semantically, the ID should never be read as null: either the entity is read by JPA and the ID is not null, or the entity is created, and we should make sure that the ID is generated (by flushing the EntityManager if necessary) before we read it.

Unfortunately, Kotlin doesn’t support lateinit var for Long.

The code

    lateinit var id: Long

doesn’t compile: 'lateinit' modifier is not allowed on properties of primitive types.

This is surprising to me. Maybe I’m missing something, but Kotlin could deal with this for me by using a java.lang.Long instead of a long, just as it does transparently when using a property of type Long? rather than Long.

But I can’t do much about that, and I preferred not to use a primitive type as the ID (Hibernate recommends using nullable, non-primitive types for generated IDs). So I kept using var id: Long? for our IDs, even though it forces us to use the !! operator (mainly in tests). Maybe we’ll change this strategy later. If you have an explanation of why Kotlin doesn’t allow lateinit var on Long, I’d be happy to learn about it.


Our DTOs (sent as JSON from the server, or received as JSON from the browser) were really meant, from the beginning, to be immutable data classes. Except data classes don’t exist (yet) in Java. And IntelliJ can’t read your mind. So we converted all our DTOs to data classes by hand.

Note that we didn’t use data classes for the JPA entities. This is an anti-pattern to me, for the following reasons:

  • in general, I prefer not to have hashCode() and equals() methods in entities, and data classes do have such methods. equals() and hashCode() on entities are most of the time semantically incorrect, because entities are mutable and are stored in HashSets when used in toMany associations, which breaks the HashSet contract.
  • it’s sometimes possible, but hard, to write equals() and hashCode() methods correctly for entities, but they should not use the auto-generated ID. And data classes include all the fields of the class in those methods.

Stream operations

This is where the converter really does a bad job at converting code. This simple, idiomatic Java line of code:

    public List<CountryDTO> list() {
        return countryDao.findAllSortedByName()
            .stream()
            .map(CountryDTO::new)
            .collect(Collectors.toList());
    }

is converted to this non-compiling Kotlin monstrosity:

    fun list(): List<CountryDTO> {
        return countryDao.findAllSortedByName().stream()
            .map<CountryDTO>(Function<Country, CountryDTO> { CountryDTO(it) })
            .collect<List<CountryDTO>, Any>(Collectors.toList())
    }

So I changed all these pieces of code to the following idiomatic Kotlin code:

    fun list(): List<CountryDTO> {
        return countryDao.findAllSortedByName().map(::CountryDTO)
    }


We use Mockito a lot in our tests, and using Mockito with Kotlin is only really possible with the mockito-kotlin library. So I had to manually change all the calls to when, verify, any, etc. into calls to the mockito-kotlin extension functions (whenever, etc.).

The idiomatic way of naming a test method in Kotlin is to use a real sentence. So I wrote and executed a simple script to change all the methods like

    fun shouldNotUpdateMediationCodeIfLetterStaysTheSame()

into

    fun `should not update mediation code if letter stays the same`()

Meta annotations

This is the only thing that we could not migrate to Kotlin. We have the following meta-annotation:

    excludeFilters = @ComponentScan.Filter(
        type = FilterType.ASSIGNABLE_TYPE,
        classes = {AuthenticationConfig.class}))
public @interface GlobeMvcTest {
    @AliasFor(annotation = WebMvcTest.class, attribute = "value")
    Class<?>[] value() default {};
}

I tried everything I could to convert this annotation to Kotlin, until I realized that it was actually not possible due to this known bug: it’s impossible to apply an annotation to an annotation method in Kotlin. So this is the only Java class remaining in our code.

Git history

Renaming files from .java to .kt and converting their content from Java code to Kotlin code in a single commit confuses Git, which thinks you just deleted a bunch of files and created a bunch of newer ones. To preserve the history and help reviewers, I had to rewrite the history, by first creating a commit which only renames files (without changing their content), then a second commit applying the changes.


The takeaway is the following: it’s possible to convert to Kotlin, and the automatic converter helps a lot, but you should start with the downstream, lower layers of the code, and you will have to apply manual adjustments to the converted code, either to fix it, or to make it cleaner and more idiomatic.

In the end, I’m very happy with the result. The code is easier to read. We didn’t actually find bugs thanks to the migration, but the code is now cleaner, and reduced from approximately 10,000 lines of code to 8,000 (mainly due to getters and setters being removed).

We found two small negative side effects though:

  • the code coverage, measured by JaCoCo, went down significantly. The reason is that data classes contain a lot of generated code (equals, hashCode, copy, component1, component2, etc.) that is never actually used in the code. Not a big deal, but if you have a way to configure JaCoCo to ignore those methods, I’d be happy to learn about it.
  • the compilation time (which is negligible compared to the time needed to run the tests, and even more so compared to the time needed to build the frontend) went from 3 seconds in Java to 10 seconds in Kotlin. This shows how remarkably fast the Java compiler is, and how much the Kotlin compiler can probably improve.

Migrating the Gradle build

Migrating the Groovy-based Gradle scripts to Kotlin was the next natural step. This is much faster to do, because there is much less code to migrate. The difficulty is the lack of documentation, so I wrote a migration guide.

The reaction from Gradle was excellent. Since I wrote the guide, eskatos, from the Gradle team, kindly improved it, contacted me to tell me that it would soon become the basis for an official Gradle guide, and asked for contributions to the Kotlin DSL documentation.

So the Gradle documentation should soon include Kotlin examples and guides in addition to Groovy ones \o/.

JB Nizet



For the past year, Ninja Squad has been giving some of its time to the development of a rather special piece of software: a secure tool for Globe 42, a non-profit organization that offers a space for popular education and community health to elderly migrants. Ninja Squad wanted to volunteer to help this organization, which could not afford the development of an application to simplify its daily work.

Introducing the Globe 42 organization

But what do we mean by “community health”? According to the WHO, it is “a process by which the members of a community, geographical or social, aware of belonging to the same group, reflect together on their health problems, express their priority needs, and actively participate in the design, running and evaluation of the activities best able to answer these priorities”. The approach thus consists of building, together with the people who need care, the right tools and practices to make sure they really get better. For example, when a migrant who speaks French poorly and cannot read it sees a doctor: how do you make sure he fully understands the prescription, and changes his eating habits if needed? This is hard to achieve if the prescription ends with the fifteen-minute visit to the doctor’s office. Hence the idea of a community space, where one can at the same time learn French, talk with people in the same situation, share a meal and moments of conviviality, and learn to take better care of oneself.

Malika Lebbal started this association after several years of classic social worker assignments, and also several years of volunteering in a support collective for undocumented migrants.

Malika Lebbal

After 15 years of experience, Malika wanted to embark on a different project: a caring community health space, where the sometimes too distant positions of social workers and people in need of help can be overcome. Malika often speaks of a goal of “sharing powers and knowledge”. She drew inspiration from examples such as La case de santé in Toulouse, La place de santé in Seine-Saint-Denis, and community health centers in Montreal and Brussels.

Globe42 was thus born in 2009, and quickly focused on the specific difficulties of elderly migrant women, in terms of access to rights and to health. The association opened premises in Saint-Etienne, in the Chavanelle district, where two ninjas live (the ones who reproduced, and whose two junior ninjas attend the neighborhood school 100 meters from the Globe42 premises).

Poster for the opening of the Globe 42 premises

In 2012, Malika undertook a continuing education program, a Master’s degree. She started an action research project on the health of elderly migrant women. The choice of action research was not trivial: she wanted the women to be actors of the research, and saw “action research as a consciousness-raising tool, with a collective approach to appropriating the data and building appropriate answers”.

Ha, but this is about data… so a real need for “ethical” data collection (anonymized, secured) appeared: a job for ninjas!


Until then, the members of the association managed the data about their members without any computer, to avoid the confidentiality flaws a non-specialist may fear, especially regarding the cloud. This meant a real waste of time, and above all they could not cross-reference figures that could have helped them: how many people attend the meals, the health mediations? How much time do the members of the association spend on social mediation, on reception…? They tracked each migrant’s situation in paper notebooks: the procedures at the prefecture, the progress of residence permit applications… in short, none of this information was computerized!

So we wanted to help this association by volunteering our services as developers. Not only did we get ourselves a side project to practice the technologies of the moment (Angular 6, Spring Boot 2, Kotlin, JUnit 5, …) on a real application, but we also took the opportunity to help an association that deserved it, and whose daily work we could make easier.

If you are interested, the code is on Github. Thanks, by the way, to Clever Cloud, which agreed to donate a free instance to host the application.

The ninjas’ work

It is hard to say exactly how much time this took us, maybe 70 to 80 days over the last 18 months; we are not big fans of time tracking! We made a point of following, as in any corporate project, an iterative process: we spend time talking with the members of the association and understanding their needs (we take part in the association’s Thursday meals, which honor a dish from one country, cooked by a migrant, and gather 20 to 30 people around a big table).

A meal at Globe42

We know a bit more about residence permits now. We suspected that being a migrant was not easy, but the administrative complexity of obtaining residence permits in France today is quite impressive. Once the needs of the association are stated and prioritized, we give ourselves a few weeks to implement them. We then go back to the association’s premises for a discussion with the members (we favor in-person meetings, never remote): we demo what we have built, gather their feedback, and take new needs into account. And so on; we iterate. Between meetings, the members of the association really use the new features, daily. We have no staging server: after the demo, and after fixing a few points if necessary, we release a new version of the application to production. The members of the association have thus moved from an all-paper mode to a somewhat more hybrid (there is still paper, but much less!), much more computerized one.

Become a tech-activist too ✊

As developers, we sometimes wonder how we can help people, associations, or community centers, and how we can be useful on social projects. Know that your technical skills can help a lot. We can guarantee there are plenty of requests; you just have to push the door of some of the organizations in your neighborhood!

During her memorable keynote at MiXiT in 2016, about Tech Activism, Idalin Bobé led us to reflect on this question: what societal contribution do we want to make through our profession? She encouraged us to make our skills available, even just a few hours a month, to projects with a societal or environmental impact.

Idalin Bobé

We often talk about the software and applications, in particular those of the GAFAM companies, that introduce algorithms ruling our lives in an often too intrusive way, and that sometimes do not exactly make our world a better place. Well, you too can contribute, however modestly, to reversing that trend!

Agnès Crépet



Angular 6.0.0 is here!

Angular logo

It has a really big novelty, which is not exactly a feature: the new Ivy renderer. As it is still experimental, I’ll close this article with it, and start with the other new features and breaking changes.

We made a little video to give you an overview of the new features. If you want to dive deeper into what has changed, keep on reading after watching it ;).

Tree-shakeable providers

There is now a new, recommended, way to register a provider, directly inside the @Injectable() decorator, using the new providedIn attribute. It accepts 'root' as a value or any module of your application. When you use 'root', your injectable will be registered as a singleton in the application, and you don’t need to add it to the providers of the root module. Similarly, if you use providedIn: UsersModule, the injectable is registered as a provider of the UsersModule without adding it to the providers of the module.

@Injectable({
  providedIn: 'root'
})
export class UserService {
  // ...
}

This new way has been introduced to get better tree-shaking in the application. Currently, a service added to the providers of a module will end up in the final bundle even if it is not used in the application, which is a bit sad. And if you use lazy-loading, you can fall into a bunch of traps, or end up with the service bundled in the “wrong” place.

It should not happen often in applications (if you write a service, you usually use it), but third party modules sometimes offer services that you don’t use, and you end up with a big bundle of useless JavaScript.

So it will be especially useful for library developers, but it is now the recommended way to register an injectable even for application developers. The new CLI will even scaffold a service with providedIn: 'root' by default now.

In the same spirit, you can now declare an InjectionToken and directly register it with providedIn and give it a factory:

export const baseUrl = new InjectionToken<string>('baseUrl', {
  providedIn: 'root',
  factory: () => 'http://localhost:8080/'
});

Note that it also simplifies unit testing. We used to register the service in the providers of the testing module to be able to test it. Before:

beforeEach(() => TestBed.configureTestingModule({
  providers: [UserService]
}));

Now, if the UserService uses providedIn: 'root':

beforeEach(() => TestBed.configureTestingModule({}));

Don’t worry though: all the services registered with providedIn aren’t loaded in the test, they are instantiated lazily, only when they are really needed.

RxJS 6

Angular 6 now uses RxJS 6 internally, and requires you to update your application as well.

And… RxJS 6 changed the way to import things!

In RxJS 5, you were probably writing:

import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/of';
import 'rxjs/add/operator/map';

const squares$: Observable<number> = Observable.of(1, 2)
  .map(n => n * n);

RxJS 5.5 introduced the pipeable operators:

import { Observable } from 'rxjs/Observable';
import { of } from 'rxjs/observable/of';
import { map } from 'rxjs/operators';

const squares$: Observable<number> = of(1, 2).pipe(
  map(n => n * n)
);

And RxJS 6.0 changed the imports:

import { Observable, of } from 'rxjs';
import { map } from 'rxjs/operators';

const squares$: Observable<number> = of(1, 2).pipe(
  map(n => n * n)
);

So, one day, you’ll have to change the imports across your application. I say “one day” and not “right now” because RxJS released a library called rxjs-compat, that allows you to bump RxJS to version 6.0 even if you, or one of the libraries you’re using, is still using one of the “old” syntaxes.

The Angular team wrote a complete document to explain all this; it’s a must read when you start your Angular 6.0 migration.

Note that a very cool set of tslint rules has been released, called rxjs-tslint. It contains just 4 rules that, when added to your project, will automatically migrate all your RxJS imports and RxJS code to the brand new version with a simple tslint --fix! Because, if you don’t know about it, tslint has a fix option that will autocorrect all the violations it can! It can be used in an even simpler way by installing rxjs-tslint globally and running rxjs-5-to-6-migrate -p src/. I gave rxjs-tslint a try on one of our projects, and it worked fairly well (run it at least twice to also collapse all the imports). Check out the project README to learn more.

If you want to discover more about RxJS 6.0, you can watch this talk by Ben Lesh at ng-conf.


i18n

The big one for i18n is the upcoming possibility to have “runtime i18n”, without having to build the application once per locale. This is not yet available (there are just prototypes for now), and it will need the Ivy renderer to work (keep reading to learn what that is). So we will probably have to wait a few weeks/months before using it.

Another i18n-related change has landed, and this one is immediately available. The currency pipe was improved in a way that makes a lot of sense: it will not round every currency with 2 digits anymore, but will round the currency to the most appropriate digit number (which can be 3 like for the Arabic Dinar of Bahrain, or 0 like for the Chilean Pesos).

If you need to, you can retrieve this value programmatically by using the new i18n function getNumberOfCurrencyDigits.

Other formatting functions have also been exposed publicly, like formatDate, formatCurrency, formatPercent, and formatNumber.

Pretty handy if you need to apply the same transformations as the pipes, but from within your TypeScript code.
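As an aside, the per-currency digit data comes from CLDR, which the standard Intl API also exposes. Here is a small sketch (plain TypeScript using the built-in Intl API, not Angular's `getNumberOfCurrencyDigits`) showing the digit counts the improved currency pipe now respects:

```typescript
// Sketch: look up the fraction digits of a currency via the standard Intl API.
// This is not Angular's getNumberOfCurrencyDigits, but it reads the same CLDR
// data, so it illustrates the behavior of the improved currency pipe.
function currencyDigits(code: string): number {
  return new Intl.NumberFormat('en', { style: 'currency', currency: code })
    .resolvedOptions().maximumFractionDigits;
}

console.log(currencyDigits('BHD')); // Bahraini Dinar: 3 digits
console.log(currencyDigits('CLP')); // Chilean Peso: 0 digits
console.log(currencyDigits('EUR')); // Euro: 2 digits
```

In an Angular application you would of course call `getNumberOfCurrencyDigits` from `@angular/common` instead.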


Animations

The polyfill web-animations-js is not necessary anymore for animations in Angular 6.0, except if you are using the AnimationBuilder. Your application may have saved a few precious bytes! If the browser does not support the element.animate API, Angular 6.0 falls back to CSS keyframes.

Angular Elements

Angular Elements is a project that lets you wrap your Angular components as Web Components and embed them in a non-Angular application. This project has been existing for a few months but was in the “Angular Labs” previously (in other words, was still experimental). With v6, it’s now a little bit pushed in the front and officially part of the framework. As it is a big topic by itself, we have a dedicated blog post about it (coming soon).


ElementRef<T>

When you want to grab a reference to an element in your template, you can use @ViewChild or @ViewChildren, or even inject the host ElementRef directly. The drawback, in Angular 5.0 and older, is that said ElementRef had its nativeElement property typed as any.

In Angular 6.0, you can now type ElementRef more strictly if you want:

@ViewChild('loginInput') loginInput: ElementRef<HTMLInputElement>;

ngAfterViewInit() {
  // nativeElement is now an `HTMLInputElement`
  this.loginInput.nativeElement.focus();
}

Deprecations and breaking changes

Let’s talk about what you should be aware of before attempting a migration!

preserveWhitespaces: false by default

In the “bad things that can happen when you upgrade” section, note that preserveWhitespaces is now false by default. This option was introduced in Angular 4.4, and if you want to know what to expect, you should read our blog post about that. Spoiler: it may be completely fine or break your layouts.

ngModel and reactive forms

It used to be possible to have ngModel and formControl on the same form fields, but this is now deprecated and the support will be removed in Angular 7.0.

It was a bit confusing, and was probably not doing exactly what you were expecting (ngModel was not the directive you know, but an input/output on the formControl directive doing roughly, but not exactly, the same job). We thought it was confusing too, so we removed the chapter about it from our ebook 6 months ago.

So using code like:

<input [(ngModel)]="name" [formControl]="nameCtrl">

will now yield a warning.

You can configure your app to emit the warning always (the default), once or never:

imports: [
  ReactiveFormsModule.withConfig({
    warnOnNgModelWithFormControl: 'never'
  })
]

Anyway, to prepare for Angular 7, you should migrate your code to use either a template-driven form or a reactive form.

Project Ivy: the new (new) Angular renderer

Soooo…. This is the 4th major release of Angular (2, 4, 5, 6), and the 3rd rewrite of the renderer!

For those who don’t know: Angular compiles your templates into equivalent TypeScript code. This TypeScript code is then compiled, along with the TypeScript you wrote, into JavaScript code, and the result is shipped to your users. We are now on the 3rd version of this Angular renderer (the first was in the original Angular 2.0 release, and the second in Angular 4.0).

This new version of the renderer does not change how you write your templates, but comes with improvements in several fields:

  • build time
  • bundle size

This is still very experimental, and the new Ivy renderer is behind a flag that you have to explicitly set in the compiler options (in the tsconfig.json file) if you want to give it a try.

"angularCompilerOptions": {
  "enableIvy": true
}

Be warned that it is probably not very reliable, so don’t use it in production right now. It may not even work at all at the moment. But it will become the default in the near future, so you can give it a spin to see if it works for your app, and what you gain.

Let’s dive into what differs between the old renderer, and the Ivy renderer. You can skip the following sections if you are not interested in the details.

Code generated with the old renderer

Let’s take a small example: a PonyComponent taking a PonyModel (with a name and a color) as input, and displaying an image depending on the color, and displaying the name of the pony.

It looks like:

@Component({
  selector: 'ns-pony',
  template: `<figure>
    <ns-image [src]="getPonyImageUrl()"></ns-image>
    <figcaption>{{ ponyModel.name }}</figcaption>
  </figure>`
})
export class PonyComponent {
  @Input() ponyModel: PonyModel;

  getPonyImageUrl() {
    return `images/${this.ponyModel.color}.png`;
  }
}

The renderer introduced in Angular 4 generated a class for each template, called an ngfactory. It would contain (simplified code):

export function View_PonyComponent_0() {
  return viewDef(0, [
    elementDef(0, 0, null, null, 4, "figure"),
    elementDef(1, 0, null, null, 1, "ns-image", View_ImageComponent_0),
    directiveDef(2, 49152, null, 0, i2.ImageComponent, { src: [0, "src"] }),
    elementDef(3, 0, null, null, 1, "figcaption"),
    textDef(4, null, ["", ""])
  ], function (check, view) {
    var component = view.component;
    var currVal_0 = component.getPonyImageUrl();
    check(view, 2, 0, currVal_0);
  }, function (check, view) {
    var component = view.component;
    var currVal_1 = component.ponyModel.name;
    check(view, 4, 0, currVal_1);
  });
}

This is hard to read, but the main parts of this code are:

  • the structure of the DOM to create, containing element definitions (figure, ns-image, figcaption), their attributes, and text node definitions. Each part of the DOM structure in the view definition array is represented by its index.
  • change detection functions, containing the code used to check if the expressions used in the template evaluate to the same values as before. Here, it checks the result of the getPonyImageUrl method and if it changes, updates the value of the input of the image component. Same with the name of the pony: if it changes, it updates the text node displaying it.

Code generated with Ivy

With Angular 6 and the enableIvy flag set to true, the same example doesn’t generate a separate ngfactory but inlines the information directly in a static field of the component itself (simplified code):

export class PonyComponent {

    static ngComponentDef = defineComponent({
      type: PonyComponent,
      selector: [['ns-pony']],
      factory: () => new PonyComponent(),
      template: (renderFlag, component) => {
        if (renderFlag & RenderFlags.Create) {
          elementStart(0, 'figure');
          elementStart(1, 'ns-image');
          elementStart(2, 'figcaption');
          text(3);
          // ... (matching elementEnd() calls omitted for brevity)
        }
        if (renderFlag & RenderFlags.Update) {
          property(1, 'src', component.getPonyImageUrl());
          text(3, interpolate('', component.ponyModel.name, ''));
        }
      },
      inputs: { ponyModel: 'ponyModel' },
      directives: () => [ImageComponent]
    });

    // ... rest of the class
}


Everything is now contained in this static field. The template attribute contains the equivalent of the ngfactory we used to have, but with a slightly different structure. The template function will be run on every change like before, but has 2 modes:

  • a creation mode when the component is first created and which contains the static DOM nodes to create
  • an update mode, for the rest of the function, executed on every change detection (updating the image source and the text node if necessary).
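The two modes above can be sketched as a toy model in plain TypeScript (this is not real Ivy code; the names and the array standing in for the DOM are made up for illustration):

```typescript
// Toy illustration of Ivy's two-phase template function (not real Angular code).
// A bitmask tells the generated function whether to create the static nodes,
// refresh the bindings, or both (on first render).
enum RenderFlags {
  Create = 0b01,
  Update = 0b10,
}

// Stand-in for the real DOM: a single text node holding the pony name.
const textNodes: string[] = [];

function ponyTemplate(renderFlag: RenderFlags, ponyName: string): void {
  if (renderFlag & RenderFlags.Create) {
    // creation mode: build the static structure, once
    textNodes.push('');
  }
  if (renderFlag & RenderFlags.Update) {
    // update mode: refresh the bindings on every change detection
    textNodes[0] = ponyName;
  }
}

// First render creates and updates; subsequent change detections only update.
ponyTemplate(RenderFlags.Create | RenderFlags.Update, 'Rainbow Dash');
ponyTemplate(RenderFlags.Update, 'Pinkie Pie');

console.log(textNodes); // the single text node now holds 'Pinkie Pie'
```

The bitmask is why the first call passes both flags while later calls pass only Update: the static nodes are never re-created.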

What does that change?

All decorators are now inlined directly into their classes (it’s the same for @Injectable, @Pipe, @Directive) and can be generated with only the knowledge of the current decorator. This is what the Angular team calls the “locality principle”: to re-compile a component, there is no need to analyze the whole application again.

The generated code is slightly smaller, but more importantly some dependencies are now decoupled, allowing for a faster recompilation when you change one part of the application. It also plays much nicer with modern bundlers like Webpack, and will now really tree-shake the parts of the framework that you don’t use. For example if you have no pipe in your application, the code in the framework that is necessary to interpret pipes is not even included in the final bundle.

Angular used to produce heavy code. That’s not necessarily a problem, but a Hello World application was way too heavy: 37kb after minification and compression. With Ivy-generated code, the tree-shaking process is much more efficient, resulting in smaller bundles \o/. The Hello World is now 7.3kb minified, and only 2.7kb after compression, which is a huuuuuge difference. The TodoMVC app is 12.2kb after compression. These numbers come from the Angular team; we couldn’t come up with numbers of our own, as you still have to manually patch Ivy to make it work as we speak.

Check out the keynote from ng-conf if you want to learn more.

Compatibility with existing libraries

You might be wondering what will happen with libraries that have already been published using the previous packaging format if your project uses Ivy. Don’t worry: the renderer will produce Ivy-compatible versions of your project’s dependencies, even if they were not compiled with Ivy. I’ll spare you the gory details, but it should be transparent to us.

New features

Let’s see what new features we’ll have with this new renderer.

Private properties in templates

The new renderer also lifts an old restriction.

It is a direct result of the template function being inlined in a static field of the component: we can now use private properties of our components in templates. Until now, every field or method used in the template had to be public, because the generated code ended up in a different class (the ngfactory), and accessing a private property from another class would have failed the TypeScript compilation. This is no longer the case: as the template function is inside a static field, it has access to the private properties of the component.

I saw a comment from the Angular team saying that using private properties in templates is not recommended, even if it is now possible, as it may not stay possible in the future… So you should probably continue to use only public fields in your templates! Anyway, it makes unit tests easier to write, as the test can inspect the state of the component without having to actually generate and inspect the DOM to do so.

Runtime i18n

Note that this new renderer will finally allow the much awaited “runtime i18n”. This is not completely ready, but we saw a few commits that are good signs!

The cool thing is that you should not have to change your application much if you are already using i18n. But this time, instead of building your application once per locale you want to support, you will be able to just load a JSON file containing the translations for each locale, and Angular will take care of the rest!

Libraries with AoT code

Right now, a library released on NPM must publish a metadata.json file, and can’t publish the AoT code of its components. Which is sad, because we have to pay the cost of this compilation in our own applications. With Ivy, the metadata file is no longer necessary, and library authors should be able to ship AoT code directly to NPM!

Better stack traces

The generated code should now allow for better stack traces when you have an issue in your templates, yielding a nice error with the line of the template at fault. It will even allow us to put breakpoints in templates and see what is really going on in Angular.

Will NgModule disappear?

It is a far-fetched goal, but in the future we might not need NgModules anymore. This is what tree-shakeable providers are starting to tell us, and it looks like Ivy has the necessary starting blocks for the team to try to remove the need for NgModules (or at least make them less annoying). This is not for right now though, we’ll have to be patient.

This release doesn’t bring a lot of new features, but Ivy is definitely interesting for the future. Give it a try and tell us how it goes for you!

All our materials (ebook, online training and training) are up-to-date with these changes if you want to learn more!

Cédric Exbrayat



Angular CLI 6.0.0 is out with some nice new features!

The version number can be a bit surprising as the last release was… 1.7! The Angular team decided to now release the CLI with the rest of the framework, hence the big jump. Check out our article about Angular 6.0 if you haven’t!

But it is also a big major release because the internals have changed to offer us more possibilities! Note that the update might not be straightforward, as a few things have changed.

If you want to upgrade to 6.0.0 without pain (or to any other version, by the way), I have created a Github project to help: angular-cli-diff. Choose the version you’re currently using (1.2.1 for example), and the target version (6.0.0 for example), and it gives you a diff of all files created by the CLI: angular-cli-diff/compare/1.2.1…6.0.0. You have no excuse for staying behind anymore!

Let’s see what new features we have!

Support for libraries and multiple applications

This was a long-time request from developers, and now we have it! With this new version, it’s possible to have several applications in the same CLI project (now called a workspace), and to create libraries (a shared set of components, directives, pipes and services)!

It will now be easier to share a few components across multiple applications for example. A new schematic has been added to help you generate a library.

Just run ng generate library, and it will scaffold the necessary files in the projects directory. It relies upon ng-packagr, which was already the de facto tool to create Angular libraries, because it handles all the details of packaging a library following the official Angular Package Format. Based on ng-packagr, the CLI can now build a library and produce all the required files (es5 bundle, esm2015 bundle, umd bundle, metadata file for AoT compilation, public API file…). I’m not an expert on the topic, but it looks like you just need to npm publish the result and you’re good to go!

You can also have several applications in your project, with ng generate application. Actually you already have two now by default: your main application and an application containing the e2e tests.

The cool thing is that you can directly import from the library into your applications in the same CLI project, even without publishing the library on NPM.

For example, let’s say you generated a shared library. By default the CLI will produce a shared directory inside projects, with a SharedComponent and a SharedService. You’ll have something like:

- projects
-- shared
--- src
---- lib
----- shared.module.ts
----- shared.component.ts
----- shared.service.ts
- src
-- app
--- app.module.ts
--- app.component.ts
--- ...

If you want to use the SharedService inside your application, for example in app.component.ts, you simply have to import:

import { Component } from '@angular/core';
import { SharedService } from 'shared';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css']
})
export class AppComponent {
  title = 'app';

  constructor(sharedService: SharedService) {
    // note the import at the top!
  }
}

And the CLI will handle it!

This opens great possibilities for large projects, and for developers to open source libraries of useful components and services!

A slightly annoying thing right now: when you make a change to the library source, you’ll have to rebuild it manually if you want the rest of the project to see it, because there is no watch mode for ng build in a library (yet).

A new architecture

The CLI as you knew it has been broken down into several small pieces to allow the multi-projects/libraries architecture.

Most of what used to live in the CLI now lives in various schematics. In fact, pretty much everything is a schematic now, and the CLI is just a “schematic runner”. The CLI’s role is now to execute commands, and it does so with its new “Architect” package (@angular-devkit/architect).

The run command of architect accepts a target (which command to execute) and a project. So, in theory, all commands should be like:

 ng run <project>:<target>[:configuration] [...options]

But a few commands are special cases and can be run directly, like build, lint, test, xi18n. ng serve and ng e2e need the project to be specified, unless there is just one project with this target in the workspace.

So running ng build is the same as running ng run *:build, ng lint my-app is the same as running ng run my-app:lint, ng serve is the same as running ng run my-app:serve, you get it…

A few commands delegate not to @angular-devkit/architect but to @angular-devkit/schematics. These commands are ng new my-app (which is the same as ng generate @schematics/angular:application my-app), ng update and ng add. I’ll come back to the last two in a dedicated section.

This new architecture comes at a price though: a bunch of configuration files have changed. Some code has been moved around, a new dev dependency has been added (@angular-devkit/build-angular), but most importantly, .angular-cli.json is now deprecated and replaced by angular.json.

This new configuration file looks like:

{
  "version": 1,
  "newProjectRoot": "projects",
  "projects": {
    "ponyracer": {
      "root": "",
      "projectType": "application",
      "cli": {
        "packageManager": "yarn"
      },
      "architect": {
        "build": {
          "builder": "@angular-devkit/build-angular:browser",
          "options": {
            "outputPath": "dist",
            "index": "src/index.html",
            "main": "src/main.ts"
            // ...
          }
        }
      }
    }
  }
}

It’s far bigger than this sample of course, but you can see what I was explaining about the new architecture. New applications or libraries will be generated in the projects directory. My configuration here is for one project, called ponyracer, which is an application. The CLI can be customized to use another package manager, like Yarn. Then you have a long section for architect, the command runner. Each available command is a key, for which a builder is needed: for example, build runs @angular-devkit/build-angular:browser, with a bunch of options you can override if you want to.

Migrating to this new configuration is a bit cumbersome, but not that hard. You can do it by hand, using angular-cli-diff to help you, or you can try the brand new ng update feature of the CLI.

ng update and ng add

The ng update command was introduced in 1.7, but it was little more than a glorified npm install. With this release, it starts to show its potential!

It’s now a command that can install packages and run migration scripts automatically. The command will look into the package.json file of the package you’re specifying for a key called ng-update. If it finds one, it will try to run the migration scripts found. You have to specify from which version you update (and to which one if you want to).

The CLI itself offers a migration script to go from 1.x to 6.0. You can run the migration script alone with ng update @angular/cli --migrate-only --from=1.7.4, and it will automatically add the missing dependencies, move the code around to match the new layout, and migrate the old configuration file to the new angular.json one. It works well enough, even if it was not perfect when we tried it. So give it a try, but don’t trust it blindly, and check manually that everything looks good.

RxJS also offers scripts to update your app to RxJS v6, with ng update rxjs --migrate-only --from=5.5.9 for example.

Note that the same is possible with ng add: when adding a package with ng add, the CLI will look for the ng-add key in the package.json file of the package you are installing and will run it. For example, if you add Angular Elements to your project with ng add @angular/elements, a script will add the required polyfill to your application. Another example is Angular Material: just run ng add @angular/material and it will set up your application, by adding the CSS imports, the default theme, the necessary module import, etc. Material goes even further and provides a few schematics that you can use. For example, if you run ng generate @angular/material:material-nav --name=nav, it will generate a NavComponent with the boilerplate necessary in its template to display a navbar.

On paper, it looks great, a bit like what Facebook does for React with the codemod project. In practice, its usefulness will greatly depend on whether the ecosystem adopts it. But this could be quite cool if the feature becomes reliable: we can imagine migrating Angular or the CLI from one version to the next by relying solely on the tooling and a single command line!

New schematics

Now that the CLI is broken down into several pieces, we have one package/schematic per functionality. Let’s have an overview of the packages currently available:

  • @angular-devkit/build-angular: this is the one to build an Angular application, now a required dependency in your CLI projects.
  • @angular-devkit/build-ng-packagr: this is the schematic for generating and building a library, based on ng-packagr.
  • @angular/pwa: the schematic to transform your app into a Progressive Web App. See our blog post about it for more details about PWA and Service Workers support. Just run ng add @angular/pwa and you’ll have transformed your application into a progressive one!
  • @angular-devkit/build-optimizer: the plugin that makes crazy optimizations to your application, to ship as little code as possible to your users.

Breaking changes

The CLI 6.0 only supports Angular 5.x and 6.x of course (check out our blog post about Angular 6.0), but no longer Angular 2.x and 4.x.

The minimum NodeJS version has also changed to 8.9+ (and NPM to 5.5+).

The configuration files and the project layout have changed quite a bit, as we pointed out above, so you’ll have to move things around and migrate your configuration files (with ng update and/or manually by checking angular-cli-diff)

Note that the environment concept has slightly changed and is now called a configuration. You can’t run ng build --env=prod anymore as the option has been removed; building with ng build --prod is now the same as running ng build --configuration=prod.

A configuration can contain build options and file replacements. A build option is typically --aot, for example. A file replacement is what is done natively with the environment.ts file, which is replaced at build time, as it was previously.

The cool thing is that you can create several configurations to avoid memorizing a long command. For example, when you want to build the application in a specific locale, you have to type something like ng build --aot --output-path=dist/fr --i18n-locale=fr --i18n-format=xlf --i18n-file=src/locale/ (which nobody can remember). With this new configuration system, you can add your configuration to your angular.json file:

"build": {
  "builder": "@angular-devkit/build-angular:browser",
  "configurations": {
    "fr": {
      "aot": true,
      "outputPath": "dist/fr",
      "i18nFile": "src/locale/",
      "i18nFormat": "xlf",
      "i18nLocale": "fr"
    }
  }
}

A configuration can also contain as many file replacements as you want. For example, the production configuration replaces the environment.ts file:

"configurations": {
  "production": {
    "fileReplacements": [
      {
        "replace": "src/environments/environment.ts",
        "with": "src/environments/"
      }
    ]
  }
}

A configuration is specific to a command. In the example above, I added the fr configuration to the build command, allowing you to run ng build --configuration=fr. But you can reuse a configuration for another command by referencing it:

"serve": {
  "builder": "@angular-devkit/build-angular:dev-server",
  "configurations": {
    "fr": {
      "browserTarget": "ponyracer:build:fr"
    }
  }
}

Another thing that can impact you: the generated files no longer have .bundle or .chunk in their names. main.bundle.js is now main.js but, worse, admin.module.chunk.js is now admin-admin-module-ngfactory.js, reflecting that my AdminModule is in an admin directory in my project. That allows people to have two modules with the same name in different locations, at the price of a painfully long name for those who have just one… And inline.bundle.js has been renamed runtime.js. If you have scripts relying on these names, don’t forget to update them.

Also, a few commands have lost or renamed some options and gained others… Don’t be surprised if your usual command does not work right away. For example, --single-run has been removed from ng test; you should now use ng test --watch=false. The kind of thing that breaks continuous integration (and the developers’ nerves) when upgrading…
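If your continuous integration relied on the old flag, one option is to centralize the new command in a package.json script (the script name and the headless browser flag are my own choices, not something the CLI mandates):

```json
"scripts": {
  "test:ci": "ng test --watch=false --browsers=ChromeHeadless"
}
```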

ng get/set has been removed and replaced with ng config, for example you now have to use ng config cli.packageManager yarn.

And to finish, ng eject is currently not supported (but will come back soon).

Now that the unpleasant stuff is out of the way, let’s see what other stuff this new release brings.

Webpack 4

You probably know that under the hood the CLI uses Webpack to do the heavy lifting. Webpack has released its 4.0 version: you can read more about it on the official blog.

TL;DR: Webpack 4 is faster, should be smarter at bundling the common parts of the application, has a new option (sideEffects) that enables better tree-shaking, and adds WebAssembly support.

The Angular CLI team has done an awesome job and integrated Webpack 4 right away into the CLI, and it brings some nice improvements in build times and bundle sizes.

Dynamic lazy-loading

Angular provides a nice way to have lazy-loading in your application via the router. This is usually enough, but sometimes you might find yourself in a situation where you would like to lazy-load a module programmatically, on demand.

Something like:

constructor(loader: SystemJsNgModuleLoader) {
  // load the module by its path and exported class name
  loader.load('app/admin/admin.module#AdminModule')
    .then(factory => ...);
}

The problem was that the CLI was only able to bundle modules separately if they were found in a loadChildren route configuration. So you had to “trick” the CLI and Webpack into building a separate chunk.

With Angular CLI 6.0, that’s no longer necessary. A new option, called lazyModules, can be added to your angular.json, to inform the CLI that you have other NgModules that need to be lazy-loaded, and Webpack will build the necessary chunks:

"lazyModules": [ "app/admin/admin.module" ]
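For context, here is roughly where the option sits in angular.json, under the build target's options (a sketch based on the default layout; your builder options will contain more entries than shown here):

```json
"architect": {
  "build": {
    "builder": "@angular-devkit/build-angular:browser",
    "options": {
      "main": "src/main.ts",
      "lazyModules": ["app/admin/admin.module"]
    }
  }
}
```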

Better error stacks

This is not really a CLI feature so much as a rather old Zone.js feature, but the environment.ts file has been enriched with an import you can uncomment:

import 'zone.js/dist/zone-error';

It transforms the usual stack traces containing all the Zone.js frames into a cleaner and less verbose one containing just the necessary frames.

This is only included in the development environment, because it can have a performance impact on your production code.
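As a sketch, the relevant part of the generated src/environments/environment.ts looks roughly like this (comments paraphrased):

```typescript
export const environment = {
  production: false
};

// Uncomment this import to get cleaner, Zone.js-free stack traces
// during development. Keep it commented out in production builds,
// as it has a performance impact.
// import 'zone.js/dist/zone-error';
```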

Installation time

As the CLI is now split into several sub-packages, the installation time should be greatly reduced. So you should no longer have time to grab a coffee when running npm install --global @angular/cli ;).

To sum up, this release is a huge one for the CLI. It does imply a bit of work from you to migrate, but it’s worth it for the modularity it brings. You might want to let things dry a little though, and wait for a few weeks to upgrade your main projects…

It took us quite some time to update our ebook and online training with these novelties, but we are up-to-date!

Check out our ebook, online training (Pro Pack) and training if you want to learn more!

Cédric Exbrayat

Become a ninja with Angular

Cover of ebook Become a ninja with Angular

Pay what you want and support charity!



17-19/09 in Lyon
22-24/10 in Paris

Angular avancé

20-21/09 in Lyon
25-26/10 in Paris


Angular CLI 1.7.0 is out with some nice new features!

If you want to upgrade to 1.7.0 without pain (or to any other version, by the way), I have created a Github project to help: angular-cli-diff. Choose the version you’re currently using (1.2.1 for example), and the target version (1.7.0 for example), and it gives you a diff of all files created by the CLI: angular-cli-diff/compare/1.2.1…1.7.0. You have no excuse for staying behind anymore!

Let’s see what new features we have!

App budgets

One of the major new features is the ability to set budgets for your applications. In .angular-cli.json, you can now add a new section looking like:

"apps": [
  {
    "budgets": [
      { "type": "bundle", "name": "main", "baseline": "300kb", "warning": "30kb" },
      { "type": "bundle", "name": "races", "maximumWarning": "360kb" },
      { "type": "allScript", "baseline": "1.4mb", "maximumError": "100kb" },
      { "type": "initial", "baseline": "1.6mb", "error": "100kb" },
      { "type": "any", "maximumError": "500kb" }
    ]
  }
]

As you can see, there are several types of budget:

  • bundle, a specific bundle that you name;
  • allScript, all the scripts of your application;
  • all, the whole application;
  • initial, the initial size of the application;
  • anyScript, any one of the scripts;
  • any, any one of the files.

The sizes are compared to the baseline you specify. If you don’t specify a baseline, then the baseline used is 0.

There are several types of error:

  • maximumWarning: warns you if size > baseline + maximumWarning;
  • minimumWarning: warns you if size < baseline - minimumWarning;
  • warning: same as defining the same maximumWarning and minimumWarning;
  • maximumError: errors if size > baseline + maximumError;
  • minimumError: errors if size < baseline - minimumError;
  • error: same as defining the same maximumError and minimumError.
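The threshold rules above can be sketched as follows (an illustrative model, not the CLI's actual implementation; the names are my own):

```typescript
interface Budget {
  baseline?: number;       // size in kb; defaults to 0 when omitted
  maximumWarning?: number; // warn if size > baseline + maximumWarning
  minimumWarning?: number; // warn if size < baseline - minimumWarning
}

function shouldWarn(sizeKb: number, budget: Budget): boolean {
  const baseline = budget.baseline === undefined ? 0 : budget.baseline;
  if (budget.maximumWarning !== undefined && sizeKb > baseline + budget.maximumWarning) {
    return true;
  }
  if (budget.minimumWarning !== undefined && sizeKb < baseline - budget.minimumWarning) {
    return true;
  }
  return false;
}

// With a 300kb baseline and a 30kb maximumWarning,
// a 340kb bundle warns (340 > 330) but a 320kb one does not:
console.log(shouldWarn(340, { baseline: 300, maximumWarning: 30 })); // true
console.log(shouldWarn(320, { baseline: 300, maximumWarning: 30 })); // false
```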

This is a pretty cool feature, as it allows you to keep sizes in check without additional tooling (like bundlesize)! And these may be the only budgets your app won’t go over ;)

ng update

Good news: we now have a command to automatically update the Angular dependencies of our CLI applications.

If you use the new CLI 1.7, just run:

ng update

And all your @angular/* dependencies will be updated to the latest stable! This includes all the core packages in your dependencies and devDependencies, but also the CLI itself, and other Angular packages like Material, or DevKit. It does so recursively, so dependencies like rxjs, typescript or zone.js are automatically updated too!

The command does not have a lot of options (only a dry-run option, and a next option to update not to the latest stable but to the next version), so it’s currently an all-or-nothing process.

But it relies on a schematic (introduced in CLI 1.4, see our blog post), called package-update, that you can use directly. In broad strokes, a schematic is a package that contains tasks allowing developers to create code (a full project, a component, a service…) and/or to update code (like updating configuration or classes, adding a dependency, etc.). All the “classic” tasks and blueprints of the Angular CLI are in the @schematics/angular package, but the CLI team is gradually rolling out a few new ones to add features, like @schematics/package-update.

This new schematic offers 4 tasks:

  • @angular to update the Angular packages
  • @angular/cli to update the CLI
  • @angular-devkit to update the DevKit
  • all to update all at once

The ng update command calls the all task of the schematic, but you can use the schematic directly if you need or want to.

I’ve never really explained how to do so, so let’s take an example: you only want to update the Angular packages but not the CLI version.

First, install the schematic:

yarn add --dev @schematics/package-update

Add a schematics script in your package.json:

"scripts": {
  "ng": "ng",
  "schematics": "schematics"
  // ...
}

and run:

yarn schematics @schematics/package-update:@angular

And you’ll only have your Angular packages (and their own dependencies) updated.

You can also specify a version to the schematic:

yarn schematics @schematics/package-update:@angular --version 5.2.3

Configuration simplifications

I usually don’t mention that a few files have changed in the project template, but for once it comes with a few simplifications and new options, so you should definitely take a careful look at all the changes, especially in the:

  • test.ts file (new zone.js import, simplified logic)
  • polyfills.ts file (shows how to use some zone.js capabilities)
  • tslint.json file (rules added and removed)
  • package.json file (lots of dependency bumps)

You can easily see these changes with our angular-cli-diff repository, for example between an old version and the last one: angular-cli-diff/compare/1.2.1…1.7.0

E2e test suites

The e2e task can now take a --suite option, to run only part of your e2e tests. You can define suites of tests in your protractor.conf.js configuration file:

exports.config = {
  suites: {
    perf: 'e2e/perf/**/*.e2e-spec.ts',
    regression: [
      // ...
    ]
  },
  // ...
};

And then run:

yarn e2e --suite perf,regression

Service worker safety

Service workers are a really nice feature of modern browsers, and Angular offers a package to help you use them, introduced in Angular 5 (see our blog post). The Angular CLI also has very good support for them, as we explained in our blog post.

But they can also be tricky, as is everything involving caching in our industry… If you need to deactivate an already installed service worker, @angular/service-worker will include a safety-worker.js script starting with Angular 6, and the CLI 1.7 adds support to automatically include it in the production bundle. You must then serve the content of this script at the URL of the service worker you want to unregister.

Angular 6 support

As Angular 6 stable is right around the corner (end of March if everything goes well), the CLI is now compatible with it, meaning you can give version 6 a try right now!

Angular Compiler options

The Angular Compiler options are now supported!

That means that if you try to use, for example, the fullTemplateTypeCheck option introduced in Angular 5.0 (see our blog post), you can now just update the tsconfig.json file of your CLI project, and when you run ng serve --aot or ng build --prod, the option will be picked up!
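Enabling it is just a matter of adding the option to the angularCompilerOptions section of tsconfig.json (only the relevant part shown):

```json
{
  "angularCompilerOptions": {
    "fullTemplateTypeCheck": true
  }
}
```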

TypeScript 2.5 and 2.6 support

As Angular 5.1 supports TypeScript 2.5 (see our blog post) and Angular 5.2 now supports TypeScript 2.6 (see our other blog post), the CLI will no longer complain if you use these TS versions.

Webpack 4 support

As you may know, the CLI uses Webpack under the hood. Webpack is currently in version 3, but version 4 should not be far away, bringing some performance enhancements and some nice features (like the side-effects feature which should reduce our bundle sizes, better defaults, WebAssembly support, etc.).

The CLI is getting ready to switch to Webpack 4, and we should soon enjoy some of these nice features (reduced bundle sizes, faster builds)!

Better, faster, higher

The tasks have been slightly improved with the introduction of caching, so your builds should be faster!


Cédric Exbrayat


