Website Development is Too Complicated in the 21st Century

[Image: a tangled mass of wires. “Entanglement” by Simon Brass]

Call me a Luddite, call me hopelessly nostalgic, call me an old bloke who’s still stuck in the 90s, but I think website development has become too complicated in the 21st century. Ah, that feels better, I’ve said it now, it’s out there. Bite me!

How it was

Back In The Day (the 90s, let’s call it BITD), all you needed to make a website was a text editor, some free webspace, a link to an HTML primer and some free FTP software off a magazine CD-ROM. Add in a few cat pictures and pretty soon you had a “homepage” and very likely a job doing the same for others. Apart from the cat pictures, everything has changed beyond recognition since then.

While it’s true you could still get a website up and running using pretty much the same techniques mentioned above, you would not, in today’s world, be considered a “web developer”, nor be able to get a job as one.

How it is

The modern web developer is expected to have a good handle on tech such as (and not limited to): JavaScript, including frameworks/libraries such as jQuery and React, plus Node and npm; CSS and Sass; version control such as Git; build tools like Gulp or Grunt; throw in a good dollop of PHP or any number of other HTML pre-processors; and that’s all before you get working with some kind of CMS such as WordPress or Drupal, which utilise all of those technologies in their own peculiar ways.

Added to this you’ll need a solid understanding of image types, display techniques and SVGs. Then there’s all the stuff around fonts, debugging tools and coding standards. And of course good ol’ HTML5. Oh, and you’ll be expected to know your way round a database as well. Gosh, I’m exhausted thinking about it all!

There’s just too much for one person to get and keep their head around, and there’s less demarcation now than there was BITD, when there’d be different people looking after the front, middle and back ends, and other people looking after versioning and deployment. Now it seems everyone has to know everything. And so everything suffers, because no one knows everything about everything.

But OK, this is progress, things move on, I get that, so what’s my problem? I think the issue I have is that web tech changes, and continues to change, far too quickly and with little regard for the needs of its users, be they end users or the people who use it to build stuff. And anyone who complains is considered backward-looking and not cool.

Who’s to blame?

So, who’s to blame for all this? Well, of course there is the influence of the big evil corporations pushing their wares for profit, but much of the new tech comes via the open source route – so I’m going to lay most of the blame on… DEVELOPERS!

Of course we have developers to thank for all the wonderful “developments” we see around us, BUT we have to understand the developer mindset.

Developers are never happy unless they are solving something, even if it’s something that doesn’t need to be solved. Developers endlessly tinker and try to improve things, make them faster, lighter, easier to use etc. etc. Then, when they consider their work done, they get bored and move on to the next thing, not interested in the boring stuff like documentation and user testing. I should know; I used to consider myself one, BITD.

If you spend any time with developers, you’d be forgiven for thinking that they are in pursuit of a world where everything is fixed and working smoothly and efficiently with a little “ping” at the end. Nothing could be further from the truth. Such a world would be HELL for developers; with nothing to fix and tinker with, they’d start finding ways to break it all and start again. It’s like imagining the police long for a world free of criminals – if there were such a world, they’d be out of a job.

Thus developers find fault with things and build new things to fix them, hoping to impress other developers and get applauded at conferences and such. Peer admiration is far more valuable to developers than cash, so they give their stuff away and we get new stuff for free, which is great, but then we have to start using it.

A cautionary tale from history

[Image: a BBC “This page has been archived and is no longer updated” banner. I nailed many of these to the front of a BBC website…]

BITD, when I was a “client side developer” with the BBC, the big brains decided that some of our sites had become huge and unwieldy, with content being locked inside HTML tags. A solution was needed, it was thought, and so a desktop CMS called FLiP was developed. It was written in Perl and used XSLT to convert XML documents into HTML web pages. This meant that teams who previously edited static HTML now edited XML files and pressed a button to see the result.

It was a noble effort, but it took considerable resources in terms of upskilling and tooling to do essentially what we used to do using Find & Replace inside the HomeSite editor. The system frequently broke, not because it was bad, but because there was just a lot more to go wrong; XSLT/XML is a lot less forgiving than HTML.
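To give a flavour of what that meant in practice, here’s a minimal sketch of the content-in, web-page-out idea. It’s illustrative only, written in Python with lxml rather than FLiP’s actual Perl (which is long gone), and the page content and stylesheet are invented:

    # A minimal, hypothetical sketch of the XML -> XSLT -> HTML idea:
    # content lives in one document, presentation in another.
    from lxml import etree

    # The content document a team member would edit: data, no presentation.
    xml_doc = etree.XML(b"""
    <page>
      <title>Cat Pictures</title>
      <body>Some cats, photographed BITD.</body>
    </page>
    """)

    # The stylesheet that decides how that content is rendered.
    xslt_doc = etree.XML(b"""
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/page">
        <html>
          <head><title><xsl:value-of select="title"/></title></head>
          <body><p><xsl:value-of select="body"/></p></body>
        </html>
      </xsl:template>
    </xsl:stylesheet>
    """)

    # The "press a button" step: transform content into a web page.
    transform = etree.XSLT(xslt_doc)
    print(str(transform(xml_doc)))

One misplaced tag in either file and the whole transform falls over, which is roughly what “less forgiving” meant day to day.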

But on the plus side, we now had content that was separate from its presentation. It could be used in many different ways and redesigned at the touch of a button. The world was ours to repurpose and redesign as we saw fit.

Which would have been great if Erik Huggers, ex of Microsoft, hadn’t joined the BBC as Director of BBC Future Media & Technology and at a stroke swept away everything that wasn’t in the top five of our audience figures (pretty much everything that wasn’t news, sport, weather or iPlayer).

So it was all junked, or left mouldering on a hard drive somewhere. What was deemed fit to remain had an archive sign like the one above nailed to the front of it. New technologies were wheeled in, and yes, some of us had picked up new skills, but who’s using XML/XSLT for a CMS now?

I know it’s easy to have 20/20 vision with hindsight, but nobody ever asked the question “Is this new thing better than what we’re already doing?” or “How will we measure whether it’s any better than what we’re already doing?”

I remember once being exasperated trying to make a small change to a site using FLiP, and asking a manager if there was any evidence that we were saving time with this new tech. He answered that he believed a report was going to be produced sometime soon on this very question. I never saw it, and my Google searches have thus far proved fruitless 😉.

My point is that no one evaluated whether the new tech was better or whether it was really needed; it was just assumed that it’s new, it’s the future, and so it must be done. And the consequences of putting too much trust in technology without sufficient research, testing, forethought or documentation can be disastrous, as seen with the recent Boeing 737 MAX crashes involving the MCAS system.

The inefficiency curve

[Image: efficiency curve. A curve I made up…]

I think we need to regard new tech with deep suspicion. If it is to be integrated into our process it needs to be fully tested, and those extolling it need to be questioned mercilessly about how it will perform and what will happen when they leave, when it breaks, or when it’s no longer supported.

I have a theory that the thing you know is at least twice as efficient as the thing you don’t know, no matter how slick that new thing is. It’s purely anecdotal, but I reckon in truth the gap may be even wider.

Even further BITD, when I was an IT teacher, I remember having a class of women (office workers, secretaries and the like, mostly in their fifties) who were coming in for training on swanky new Microsoft word-processing software (Word). They were complaining that they didn’t like this newfangled WYSIWYG drag-and-drop thing, as they’d been used to WordPerfect-style software with “Reveal Codes”, a kind of markup for formatting text documents.

These ladies were doing command-line before it was fashionable! Of course WYSIWYG has now largely taken over for end users, for good or bad, but my point is that it would have taken time for these women to become as efficient with the new tech, even if it was better. Magnify this across a whole team and my guess is that any new system can take up to a year to fully bed in amongst team members and reach peak efficiency.

STABILITY! STABILITY! STABILITY!

[Image: stability. It’s over there! Not here.]

So how do we fix it? How do we tame the developer beast? Lock them up and allow them out only at weekends under strict supervision? Joking aside, we need to understand the developer mentality and, in so doing, allow them their creativity to do what they do, but they should not be the only voice in developing a new process or system.

ALL stakeholders, including the “non-techy” folks, need to have a say when it comes to the potential ramifications of new development. The less techy folks should not simply remain silent because they feel they don’t have enough technical knowledge; they need to speak up. If there is something that some team members are struggling with, it may be indicative that something is too complex, or not documented well enough.

It’s no good having god-like developers doling out their wares from on high to the lowly users scrabbling around in the dirt below. We all need to be part of the process.

More importantly, what I think we need more than anything in today’s world is STABILITY. It’s a word that’s perhaps not as sexy as “innovation” or “disruption”, but it is a word we need more than ever, and in the wider world, not just the tech world – but that’s another story. I think we need to slow the relentless and dizzying “progress” of web tech somewhat and adopt a more “make do and mend” mentality.

Instead of embracing every flavour of the month as it comes along, we need to ask tough questions: what end result are we trying to achieve, what is its life cycle, and do we really need this new tech to get to that destination?

When there’s a new tech, idea or process on the horizon, I suggest we all ask the following questions:

    • What exactly is this new tech all about, what does it do, why is it better?
    • What does this new tech give us that we don’t already have?
    • What is the cost of this new tech in terms of upskilling and hard cash?
    • What is the cost of not using this new tech?
    • Will this new tech be around/supported in five years’ time?
    • Do we already have something that does the same job that we’re not making use of?
    • Who will be responsible for this new tech?
    • Does it make pictures of cats?

Another approach I’d suggest trying is Documentation Driven Development. One of the managers at the Beeb championed this BITD, though it doesn’t seem to have taken off. The principle is simple: you write all the documentation for the new tech or product you’re developing BEFORE you write a line of code. This may seem backwards, but it forces you to really think about what you’re making and how users will interact with it before you start building. Added to which, because documentation is usually the boring part of the job, there’s more chance you’ll stick to what’s really necessary so as to keep it short and not start adding things nobody needs. Might be worth a try.
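In code terms the principle looks something like this. Here’s a purely hypothetical Python sketch (the function and its behaviour are invented for illustration): the docstring, i.e. the “manual”, gets written and agreed first, and the implementation waits its turn.

    # Documentation Driven Development in miniature. Everything here is a
    # hypothetical illustration; the docstring (the "manual") is written
    # and signed off BEFORE any code exists.

    def resize_cat_picture(path: str, width: int) -> str:
        """Resize the cat picture at `path` to `width` pixels wide.

        Returns the path of the resized copy; the original file is never
        modified. Raises ValueError if `width` is not positive.
        """
        # Only once everyone agrees the doc above describes something
        # worth building does anyone get to write this part.
        raise NotImplementedError

If you can’t write that docstring, or nobody can agree on what it should say, that’s a fair hint the thing isn’t ready to be built.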

Of course nobody wants to get left behind in the tech race, and we need to keep up with developments, but we need to think hard about the how, why and when. At the end of the day, we’re mostly just building new ways of making pictures of cats.

 
