Nothing to see here

This is a test to see if the ironman feed is picking up this category/tag.

Posted in devops

Paying respect to Module::Build

Earlier this week, I proposed that Module::Build be deprecated from the Perl 5 core. After some discussion, this proposal has been accepted by the Pumpking.

I want to take a moment to discuss what this means, why I suggested it, and what I think Module::Build trail blazed for Perl 5.

Deprecation means a warning in v5.19 and removal in v5.22

Soon, in Perl 5 version 19 (the current development series), the Module::Build included in the core library path will issue a warning when used. If Module::Build is then installed from CPAN, it will be installed into the "sitelib" path and the deprecation warning will stop.

Sometime in Perl 5 version 21 (starting late Q2 2014), Module::Build will most likely no longer ship with the core Perl and will need to be installed from CPAN.

Fortunately, CPAN clients already recognize a "configure_requires" directive in CPAN distribution meta files (META.yml or META.json) and can bootstrap Module::Build before Build.PL runs.
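As a concrete sketch, the relevant META.yml fragment for a Build.PL-based distribution looks something like this (the minimum version shown is illustrative):

```yaml
# Declares that Module::Build must be installed before Build.PL can
# run; CPAN clients read this and bootstrap it automatically.
configure_requires:
  Module::Build: '0.42'
```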

So... the biggest impact on end-users will be that when Perl 5 version 20 is released in Q2 2014, people will have to install Module::Build from CPAN to squelch the deprecation warning.

Update: Miyagawa reminded me on IRC that the major CPAN clients will install from CPAN automatically if they see a deprecated module in a prereq list, so actually, most people won't even notice. Shiny!

Module::Build was good, but not good enough, and it plateaued

Before Module::Build, we only had ExtUtils::MakeMaker and it sucked. Michael Schwern, its long-time maintainer, even wrote a presentation called MakeMaker is DOOMED! that encouraged people to switch to Module::Build. In hindsight, this was premature.

For all the many problems that Module::Build fixed, it introduced some of its own, built up its own technical debt, and suffered a crisis of maintenance.

I tried to roughly estimate the amount of effort going into Module::Build during four phases of its life, using lines of the Changes file as my metric (though I think "git diff --stat" would tell a similar story):

  • 2001-2007: Ken Williams author and maintainer → 2397 lines of Changes
  • 2007-2009: Ken and Eric Wilhelm tag-team → 310 lines of Changes
  • 2009-2011: David Golden maintainer → 1033 lines of Changes
  • 2011-now: Leon Timmermans maintainer → 95 lines of Changes

After my announcement that I was stepping down as Module::Build maintainer, no one volunteered for seven months, until Leon kindly offered to be a "caretaker" and shepherd some patches and releases. That was partly a side effect of his work on a Module::Build replacement called Module::Build::Tiny, itself a serious spin-off of a half-joke of my own called Acme::Module::Build::Tiny.

Module::Build innovated things now taken for granted

The best thing that Module::Build did was define a de facto specification for using a Build.PL to drive a perl-based (rather than Makefile-based) install program. That work has been formalized into a Build.PL Spec, so other Perl-based builders can be developed.
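For illustration only, a minimal Build.PL in the style that spec codifies might look like this (My::Module is a placeholder name, not a real distribution):

```perl
# Minimal Build.PL sketch following the de facto Build.PL convention.
use strict;
use warnings;
use Module::Build;

my $build = Module::Build->new(
    module_name => 'My::Module',
    license     => 'perl',
    requires    => { 'perl' => '5.008' },
);
$build->create_build_script;    # writes the ./Build program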

Module::Build also introduced the META.yml file that evolved into the CPAN::Meta::Spec that is in widespread use today. The META.yml file also helped solve a tricky bootstrapping problem: by specifying configure_requires dependencies within the META file, CPAN clients could install whatever modules were necessary to run Build.PL.

With the release of Perl v5.10, both CPAN and CPANPLUS supported configure_requires, meaning that the groundwork for future Build.PL-based alternatives was already in place!

Module::Build also introduced the install_base parameter as a way to specify a custom install location. It was much easier to understand than PREFIX from ExtUtils::MakeMaker, and was subsequently adopted by ExtUtils::MakeMaker as INSTALL_BASE. This is a critical part of the magic behind tools like local::lib and Carton.
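For example (a sketch; ~/perl5 is just an illustrative path), the two build systems spell the same idea differently:

```shell
# Module::Build style
perl Build.PL --install_base ~/perl5
./Build install

# ExtUtils::MakeMaker style, after it adopted the idea
perl Makefile.PL INSTALL_BASE=~/perl5
make install
```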

The other crucial innovation was that — for the first time — customizing the build, test and install process could be done by writing only Perl code rather than writing Perl code to spit out Makefile fragments. It made building complex modules much easier — particularly Alien modules like Alien::wxWidgets. That then made projects like Padre possible.
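As a sketch of what that pure-Perl customization looked like, here is a hypothetical Build.PL that subclasses Module::Build to hook the "test" action (the subclass name and print statement are placeholders):

```perl
# Hypothetical Build.PL: customize an action in pure Perl,
# with no Makefile fragments involved.
package My::Builder;
use parent 'Module::Build';

sub ACTION_test {
    my $self = shift;
    print "Preparing extras before testing...\n";
    $self->SUPER::ACTION_test(@_);
}

package main;
My::Builder->new( module_name => 'My::Module', license => 'perl' )
           ->create_build_script;
```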

Module::Build also spawned a counter-reaction in the form of Module::Install, which tried to make easy customization in Perl possible while shielding users from raw ExtUtils::MakeMaker, and which avoided Module::Build's bootstrapping problems by bundling itself in inc/. Module::Install then triggered a counter-reaction of its own in the form of Dist::Zilla, which in turn led to Dist::Milla and Minilla.

Module::Build was the trail blazer for the tools that came after.

Module::Build made its own unique mistakes

People have complained that Module::Build was bloated. In lines of code, it's actually comparable to ExtUtils::MakeMaker. The bigger problem is that it puts 4,200 of its 5,800 lines of code in just one file: Module::Build::Base. (ExtUtils::MakeMaker split similar functionality across three mega files.)

More than just size, Module::Build is complex. The Build.PL file runs configuration and serializes the results into some files, which the Build file uses to reconstruct the original Module::Build object. Arguments can modify properties at any stage. And since Build.PL might really be a subclass, there's a lot of meta object stuff going on just to manage the configuration before ever getting around to the real business of building and installing modules.

It also suffered from feature creep. Instead of just being an install tool, it became a swiss-army-knife author's tool, with release-time features never needed by end-users, but which forced end-users to upgrade Module::Build just to run Build.PL without error. It added new concepts, like "optional features", which were poorly specified and have never achieved much traction.

One of the big, valid complaints is that it never incorporated a proper dependency system. Actions (build, test, etc.) could depend on each other, but there was nothing like Makefile's ability to detect that since file "A" changed, then action "B" had to run.
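The kind of thing Makefiles give you for free, and Module::Build never had, is a rule like this toy sketch:

```make
# Rebuild B only when A is newer than B; Module::Build's actions
# had no equivalent file-level freshness check.
B: A
	cp A B
```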

My personal pet peeve — possibly one of the big reasons I got discouraged doing maintenance — was that it also included Module::Build::Compat, which was used to generate a Makefile.PL from the Build.PL. While this seemed like a benefit to ease transition, it meant that Module::Build needs to maintain feature-compatibility — and in many cases bug-compatibility — with ExtUtils::MakeMaker effectively forever.

Module::Build promised easy subclassing and this was mostly true. But re-use and sharing was nearly impossible. If you had a subclass to do one thing and I had a subclass that did something else and you wanted to combine them, you pretty much had to copy and paste code. Contrast that with Dist::Zilla's incredible plugin ecosystem — where just about anything you want to do has been written up into a plugin that you can just drop in.

Module::Build will live on as a CPAN distribution

Module::Build never became the uncontested successor to ExtUtils::MakeMaker, and it's not used as part of the Perl 5 core build process. It originally went into the core at least in part to ease adoption, but now all CPAN clients can bootstrap it on demand.

It's not a bad module, but it no longer has a reason to live in the Perl 5 core. Removing it means one less thing for the already-stretched Perl 5 Porters to maintain, update and support.

Module::Build helped us through a critical transition away from purely Makefile based installers. It will continue to live on CPAN and will continue to support the thousands of distributions that rely on it. If a motivated maintainer came along, it might even start to innovate again, or pay down its technical debt.

I give it — and its creator, Ken Williams — my respect for what it accomplished, even while I bid it farewell from the core.

Posted in p5p, perl programming, toolchain

How I manage new perls with perlbrew

Perl v5.19.0 was released this morning and I already have it installed as my default perl. This post explains how I do it.

First, I manage my perls with perlbrew. I install that, then use it to install some tools I need globally available:

$ perlbrew install-patchperl
$ perlbrew install-cpanm

You must install cpanm with perlbrew -- if you don't, weird things can happen when you switch perls and try to install stuff.

I keep my perls installed read-only and add a local::lib based library called "@std". (I stole this technique from Ricardo Signes.) That way, I can always get back to a clean, stock perl if I need to test something that way.

(There are still some weird warnings that get thrown doing things this way when I switch perls, but everything seems to work.)

I also install perls with an alias, so "19.0" is short for "5.19.0".

Then I have a little program that builds new perls, sets things up the way I want, and installs my usual modules. All I have to do is type this:

$ newperl 19.0

And then I've got a brand new perl I can make into my default perl.

Here's that program. Feel free to adapt it to your own needs:

#!/usr/bin/env perl
use v5.10;
use strict;
use warnings;
use autodie qw/:all/;

my $as = shift
  or die "Usage: $0 <perl-version>";
my @args = @ARGV;

# trailing "t" means do threads
my @threads = ( $as =~ /t$/ ) ? (qw/-D usethreads/) : ();

$as =~ s/^5\.//;
my $perl = "5.$as";
$perl =~ s/t$//; # strip trailing "t" if any
my $lib = $as . '@std';

my @problem_modules = qw(
);    # modules that need a forced install go here

my @to_install = qw(
);    # the usual set of modules goes here

my @no_man = qw/-D man1dir=none -D man3dir=none/;

# install perl and lock it down
system( qw/perlbrew install -j 9 --as/, $as, $perl, @threads, @no_man, @args );
system( qw/chmod -R a-w/, "$ENV{HOME}/perl5/perlbrew/perls/$as" );

# give us a local::lib for installing things
system( qw/perlbrew lib create/, $lib );

# let's avoid any pod tests when we try to install stuff
system( qw/perlbrew exec --with/, $lib, qw/cpanm TAP::Harness::Restricted/ );
local $ENV{HARNESS_SUBCLASS} = "TAP::Harness::Restricted";

# some things need forcing
system( qw/perlbrew exec --with/, $lib, qw/cpanm -f/, @problem_modules );

# now install the rest
system( qw/perlbrew exec --with/, $lib, qw/cpanm/, @to_install );

# repeat to catch any circularity problems
system( qw/perlbrew exec --with/, $lib, qw/cpanm/, @to_install );

Yes, that takes a while. I kicked it off right before going to get lunch. When I got back, I was ready to switch:

$ perlbrew switch 19.0@std

I also have a couple bash aliases/functions that I use for easy, temporary toggling between perls:

alias wp="perlbrew list | grep \@"
up () {
  local perl=$1
  if [ -n "$perl" ]; then
    perlbrew use "$perl@std"
  fi
  local current=$(perlbrew list | grep '\*' | sed -e 's/\* //')
  echo "Current perl is $current"
}

I use them like this (notice that I don't need to type my @std library for this fast switching):

$ up
Current perl is 18.0@std

$ wp
* 18.0@std

$ up 19.0
Use of uninitialized value in split at /loader/0x7fa639030cd8/local/ line 8.
Use of uninitialized value in split at /loader/0x7fa639030cd8/local/ line 8.
Current perl is 19.0@std

(there's that warning I mentioned)

I hope this guide helps people keep multiple perls for development and testing. In particular, I'd love to see more people doing development work and testing with 5.19.X so it can get some real-world testing.

See you June 21 for v5.19.1...

Posted in perl programming

Anyone want

I'm tired of paying the domain bill for (which currently just redirects to

It will lapse at the beginning of July unless someone wants to take it.

If you're interested, leave a comment below and explain what you want to do with it. I'll award it to the best proposal received by June 1.

Posted in perl programming

OODA vs technical debt

This post is a response to Ovid's series about agility without testing:

I started to respond to the last and realized that my comment was long enough to be a blog post of my own.

First, let me say that I'm enjoying this series. Ovid and Abigail are both challenging conventional wisdom around technical debt and I think that's really healthy.

However, I note that Ovid's evidence in favor of emergent behavior is anecdotal, which is probably inevitable for this sort of thing, but dangerous. "It worked the handful of times I remember" suffers from confirmation bias and has no statistical significance.

We can't run a real experiment, but we can run a thought experiment: 100 teams of strict TDD vs 100 teams of the Ovid approach [which he really ought to brand somehow] from the same starting point (perhaps in parallel universes) for a few months of development.

What could we expect? Certainly, the TDD teams will spend more of their time on testing than the Ovid teams. So the Ovid teams will deliver more features and fix more bugs in the same period of time.

If one believes even a little of the Lean Startup hype, the Ovid teams will have more opportunities to see customer reactions — they will have a shorter OODA loop.

On the flip side, the TDD teams have less technical debt and a lower risk profile. I disagree with the idea that technical debt is an option. I believe it does have an ongoing cost — that future development is less efficient and more time consuming to at least some degree.

I call this "servicing" technical debt, which is just like paying only the interest on your credit card. You might never pay down any of the technical debt, but as you accumulate more, you'll pay more to service it.
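As a toy illustration with made-up numbers (the 5% per-sprint "interest" on accumulated debt is an assumption, not a measurement):

```perl
use strict;
use warnings;

# Toy model: each sprint adds 10 units of shortcuts; servicing then
# costs a fixed fraction of the outstanding debt every sprint.
my $debt = 0;
my $rate = 0.05;    # assumed servicing cost per unit of debt per sprint

for my $sprint ( 1 .. 4 ) {
    $debt += 10;
    printf "sprint %d: debt %d, servicing cost %.1f\n",
        $sprint, $debt, $debt * $rate;
}
# By sprint 4, 2.0 "units" per sprint go to servicing alone.
```

Nothing is ever paid down in this model, yet the carrying cost grows every sprint — that's the "interest-only" trap.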

It seems clear to me that which result you prefer depends quite a lot on the maturity of the product (possibly expressed in terms of expected growth rate) and the overall risk level.

For a brand-new startup, the risk of failure is already pretty high regardless of coding style. A faster OODA loop probably reduces risk more than improved tests do, because the bigger risk is building something customers don't want. And with such a high risk of failure, there's a chance that you'll simply be able to default on technical debt.

If I can riff on the financial crisis, a startup has subprime technical debt. It's either successful — in which case there will be growth sufficient to pay off technical debt (if the risk/reward tradeoff justifies it) — or it fails, in which case the debt is irrelevant. Rapid growth deflates technical debt.

For a mature business, however, it might well go the other way. Risk to an existing profit stream is more meaningful and technical debt has to be paid off or serviced (rather than defaulted on) which reduces future profitability that might not be sufficiently offset by growth.

The quandary will be businesses — or products (if part of an established business) — that are in between infancy and maturity. There the "right" approach will depend heavily on the risk tolerance and growth prospects.

Regardless, I tend to strongly favor TDD at the unit-test level, where I find TDD helps me better define what I want out of a particular piece of code and then be sure that subsequent changes don't break that. At the unit test level, the effort put into testing can be pretty low and the benefits to my future code productivity fairly high.

But as the effort of testing some piece of behavior increases — due to external dependencies or other interaction effects — it's more tempting to me to let go of TDD and rely on human observation of the emergent behaviors because I'd rather spend my time coding features than tests.

I think that puts me a little closer to the Ovid camp than the strict TDD camp, but not all the way.

Posted in coding

© 2009-2015 David Golden All Rights Reserved