JavaScript Ecosystem Journey: The Great Fragmentation (2010–2015)

From the promise of Node.js and npm to solving incompatible module systems. This is the story of how the JavaScript community fragmented into competing camps and what it took to reunify.

JavaScript Web Development Modules ES6 CommonJS AMD UMD Bundler

Picking Up from the Wild Web Era

In our previous article, we explored how JavaScript evolved from simple script tags to a more sophisticated ecosystem with jQuery, complex applications, and the emergence of Node.js. As we concluded, 2010 marked a crucial turning point where we had the capability to build complex applications but faced a new fundamental problem: incompatible module systems between server (Node.js) and browser environments.

This challenge would define the next five years of JavaScript development, as the community split into competing camps with different approaches to module management.

In this article, we’ll examine how AMD, CommonJS, and UMD emerged as competing solutions, and why this period of fragmentation was both inevitable and necessary for the evolution of modern JavaScript.

ServerJS: Two Worlds Colliding (2009)

Picture this: You’re at JSConf 2009. Chrome shipped just last year. Kevin Dangoor (Mozilla) talks about CommonJS (formerly known as ServerJS), a new standard for JavaScript APIs outside the browser. On the same stage, a little-known web developer named Ryan Dahl presents Node.js, combining Chrome’s V8 engine with libuv to handle non-blocking I/O on servers. Node.js embraces the CommonJS module format, giving the standard its killer application. Suddenly you’ve got a clean, synchronous API: require() to import modules, module.exports to export functionality. By early 2010, Isaac Schlueter launches npm and package.json, making it trivial to publish and consume Node.js modules built on CommonJS.

In just two years, modules, packages, and server-side JavaScript are born. In the background, Ruby on Rails 2.3/3.0, GitHub, and StackOverflow are exploding in popularity. Modern backend web development is taking off.

But then you look back at the browser.

The browser. Where you’re still stitching together <script> tags, where execution happens sequentially with no native module system, where every script evaluation blocks the main thread, and where developers still rely on global variables to share data between scripts. The old way — concatenating files, managing dependencies by hand, hoping nothing collided in the global namespace — was breaking under the weight of real applications.

Two worlds. One language. And absolutely no way to make them talk to each other.

The community was about to invent competing solutions.

AMD: The Browser’s Answer (2010)

In 2010, James Burke releases RequireJS 0.1, proposing Asynchronous Module Definition as a more browser-compatible alternative to CommonJS.

No more piles of script tags or hand-maintained concatenation scripts: one line, and all your scripts are fetched via AJAX, evaluated asynchronously, and run in the correct dependency order:

<script data-main="scripts/main" src="scripts/require.js"></script>
// scripts/math.js
define(function() {
  function add(a, b) {
    return a + b;
  }
  return { add: add };
});

// scripts/main.js
require(['./math'], function(math) {
  console.log(math.add(2, 3)); // 5
});

RequireJS parsed your dependency array, issued parallel HTTP requests for every module at the current nesting level, then executed your callback once everything was in memory.

But the syntax was verbose. Those nested callbacks felt awkward compared to CommonJS’s straightforward require() statements. Developers looked at AMD and thought: “This solves my problem, but why does it have to be so ugly?”

RequireJS (2010) became the primary AMD implementation, and it found real adoption. But AMD was already a compromise — a solution to a fundamental problem (browsers need asynchronous loading) expressed through awkward syntax.

Browserify: Node Modules in the Browser (2011)

Then a different idea emerged: what if you could just use CommonJS everywhere?

In early 2011, James Halliday introduces Browserify with the tagline “require() in the browser,” offering CommonJS-style module bundling for the browser. Write your code using Node.js-style CommonJS require() statements, then bundle it all into a single file that browsers could execute.

// math.js (works in both Node.js and browsers via Browserify)
function add(a, b) {
  return a + b;
}
module.exports = { add: add };

// app.js
var add = require('./math').add;
console.log(add(2, 3)); // 5

Browserify would:

  1. Parse your entry point
  2. Recursively walk the dependency tree using Node’s resolution algorithm
  3. Bundle all modules into a single file, each wrapped in a function
  4. Provide a runtime that looks up modules by ID

But it had a critical limitation: it bundled everything upfront. This meant:

• No lazy loading: users download the entire 2MB bundle even if they only exercise 10KB of it
  • No code splitting: Building separate bundles for different routes required manual intervention
  • Asset handling: Browserify didn’t know what to do with CSS or images; you needed separate tools for those
• Package incompatibility: many npm packages assumed Node.js APIs and weren’t designed to run seamlessly in browsers

UMD: The Compromise Everyone Hated (2011)

By 2011, library authors faced a choice:

  • Publish as CommonJS (works in Node, breaks in browsers)
  • Publish as AMD (requires build step, confuses Node developers)
  • Publish as a global (loses namespacing, clashes with other libraries)

Universal Module Definition (UMD) was the answer: write code that works in every environment.

(function (root, factory) {
  if (typeof define === 'function' && define.amd) {
    // AMD
    define(['jquery'], factory);
  } else if (typeof module === 'object' && module.exports) {
    // CommonJS
    module.exports = factory(require('jquery'));
  } else {
    // Browser globals
    root.MyModule = factory(root.jQuery);
  }
}(typeof self !== 'undefined' ? self : this, function ($) {
  // Module code goes here, using $ (jQuery)
  var MyModule = {};
  return MyModule;
}));

UMD was clever engineering. It detected the environment and adapted itself. The same code could run in Node.js, with AMD loaders, or as browser globals. For library authors tired of maintaining multiple versions, UMD was appealing.

But UMD was also ugly. The boilerplate was verbose and confusing. Developers maintained these massive wrapper functions that were identical for every library, and even though tools existed to generate the wrapper automatically, it still added mental overhead.

UMD was a symptom of the problem, not a solution. It was a brilliant engineering workaround for a systemic failure: the JavaScript community had competing module formats, and no path to consolidation.

The Fragmentation Crisis (2012–2015)

By 2012, the module format situation had become painful. The ecosystem was fragmenting. Each developer had to choose: pick one module system and accept its limitations, or maintain multiple builds.

A single library like jQuery needed CommonJS, AMD, and UMD builds. Maintaining three versions of your code is expensive.

If you worked on both Node.js and browser applications, you needed to understand both CommonJS and AMD. If you used different libraries, some might be CommonJS, some AMD, some UMD. The context switching was exhausting.

Fragmentation also impacted tooling. Each bundler tended to favor one module format. Browserify pushed CommonJS in browsers. AMD loaders like RequireJS pushed AMD. When you chose a tool, you were implicitly choosing a module format.

And while UMD was a step forward, it still wasn’t an interoperable solution. Using either CommonJS or AMD modules in a browser still required bundlers and loaders. The simple act of importing code had become an engineering problem.

And this pattern would repeat over and over again: the JavaScript community had invented multiple solutions to the same problem, and having multiple solutions was worse than having just one.

The Search for a Standard (2013–2014)

By 2013, people were asking: why can’t we just have one module format? TC39 looked at what the ecosystem had built and recognized the problem: incompatible de facto standards were fragmenting the language. ECMAScript 6 (ES2015) draft specifications from 2013–2015 introduced standardized modules.

The community knew what it needed: a format that was asynchronous (for browsers), static (for tooling optimization), and universal (for both server and browser).

The import/export syntax was born:

import { add } from './math.js';
export const result = add(2, 2);

ES6 Modules: Browser and Runtime Implementation (2015-)

ES2015 was officially released in June 2015, introducing standardized modules.

While browser support was maturing, SystemJS became a popular polyfill for ES6 modules: a universal module loader that could load ES6 (ES2015) modules alongside CommonJS, AMD, and UMD.

Chrome implemented modules behind a flag in 2017 (Chrome 60 dev channel), with Firefox and Safari adding production support later in 2017–2018.

But Node.js was another matter entirely. Node.js introduced an experimental --experimental-modules flag in version 8 (2017), with fully supported native ESM integration arriving across Node.js v12–v14 (2019–2021).

When ES6 modules were standardized (2015), Node.js had been using CommonJS for six years. Changing the module system overnight would break millions of packages. So Node.js did what seemed prudent: supported both.

The controversially nicknamed “Michael Jackson Script” proposal offered a solution: .js files use CommonJS, .mjs files use ES6 modules. This split made imports ambiguous — when you import { foo } from './bar', is it looking for bar.mjs or bar.js? So the ESM resolver demands the full filename, extension included.

But there was still the dual-package problem: a single published package had to serve both module systems. So package.json came to the rescue:

{
  "main": "index.js",           // CommonJS entry (legacy)
  "module": "index.mjs",        // ES6 entry (bundler convention)
  "exports": {
    "types": "./index.d.ts",    // TypeScript (matched first)
    "import": "./index.mjs",    // ES6
    "require": "./index.cjs"    // CommonJS
  }
}

But which one actually gets used? That depends entirely on the consumer’s configuration and tooling. A misconfigured package.json or bundler would silently load the wrong version, or fail with cryptic error messages.

Circular dependencies were another problem. CommonJS handles cycles by returning partially-evaluated modules; ES6 modules handle them through live bindings. But if a CommonJS module imported an ES6 module, and that ES6 module imported the CommonJS one, how should the cycle resolve? The answer was complicated enough that most teams migrated all their files to a single module format and avoided mixing the two like the plague.

The Great Fragmentation in Perspective

Looking back, the fragmentation of 2010–2015 was both necessary and damaging.

The community needed to explore different approaches.

  • AMD proved that browsers need async loading, but that asynchronous resolution makes static analysis nearly impossible.

  • CommonJS proved that synchronous APIs are developer-friendly, but that synchronous resolution doesn’t work across network boundaries.

  • UMD proved that compatibility layers are expensive, both in code size and in mental overhead.

  • ES6 modules showed that static analysis and declarative syntax enable powerful optimizations (tree-shaking, lazy loading) that none of the previous systems could achieve.

But the fragmentation created real costs. It took almost 10 years for ESM to become the norm. During this decade, we lived with:

  • Library authors maintaining multiple formats
  • Developers having to learn multiple standards
  • Tooling complexity to support all formats
  • Runtime performance implications from loading, parsing, and executing multiple module runtimes

Where We Are Now (2025)

Modern bundlers like esbuild and Rollup still need to understand CommonJS, AMD, and ES6 modules. The ghost of UMD appears whenever a library needs to support both Node.js and browsers. Configuration remains somewhat complex — you still need to know the quirks of package.json export fields and tsconfig.json module options. The module diversity never truly disappeared.

But when you write:

import React from 'react';

You’re spared the frustration of the thousands of developers who fought through incompatible tradeoffs, discovered fundamental technical constraints, and eventually consolidated around a standard.

What’s remarkable about this period is that while the community was wrestling with module system fragmentation, another parallel evolution was happening simultaneously: the rise of sophisticated bundling tools that tried to bridge these incompatible systems. It’s this story of parallel evolution — module systems fighting for dominance while bundlers worked to unify them — that we’ll explore in the next article of this series.