Legs

Two things we tend to believe about computers:
1) Every year they'll get faster, more powerful, and maybe even easier to use. The same for software.
2) Most people will be up to date.

Which is odd, because neither has been true for well over a decade. In fact, they may never have been true.

In the 1970s it was a rule of thumb that every year there would be a new generation of hardware on the market, and it would be ten times the speed of the previous year.

Estimating computation speed has never been an exact science. You can choose CPU cycles, gigaflops, or MIPS as your yardstick, each giving different results. And in any case, two systems which run at the same speed on one task might give very different results on another.

But as a rule of thumb, it worked. Every year, add a zero to your chosen measure of speed.

In the 1990s, the rule was: Every year, speed doubles. And the unspoken implication was: Speed will continue to double for evermore.

Which in retrospect made no sense at all.

If you imagine a nation where every 30 years the population doubles, then to feed everyone you'd need the production of food to double in the same time.

But if it takes X amount of ingenuity to multiply food productivity by two...it will take X times 2 ingenuity to multiply it by four. Which means every 30 years, you have to be twice as clever. And then twice again.

Like the Red Queen said, you have to run twice as fast just to stay where you are.
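As a back-of-the-envelope sketch of that arithmetic (in Python, my own illustration rather than anything from the original argument), the demand compounds like this:

```python
# Purely illustrative numbers: the population doubles every 30 years,
# so total food output must double every 30 years just to stand still.
for k in range(6):                 # six 30-year periods = 180 years
    years = 30 * k
    multiple = 2 ** k              # population, and therefore required output
    print(f"year {years:3}: need {multiple:2}x the original output "
          f"after {k} successive doublings of productivity")
```

The required output keeps multiplying; the ingenuity only ever buys you one more doubling at a time.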

Even if you're dumb enough to think you can double scientific progress by doubling the number of scientists - or doubling their wages - it's still not sustainable.

In any case, the implicit assumption would be that scientific progress is a never-ending road, which itself assumes that the universe is infinitely malleable, if only we can get smart enough. Infinitely smart in fact - which I'm pretty sure is a meaningless phrase.

Odd how we can believe obviously false things just by not quite stating them explicitly. Clarity is the enemy of the ideologue.

The world of computing hit the brick wall of unmalleable reality around the year 2000. Unless someone found a way to increase the speed of light, or shrink atoms, electricity couldn't be made to flow faster through wires.

We got hyperthreading - a trick that lets a single core juggle two streams of instructions at once, filling in the gaps whenever one of them stalls.

We got the botched rollout of 64-bit processing, whose big selling points were that you could now address more than 4GB of RAM, and that arithmetic with literally astronomical figures was slightly easier.

And we got multiple cores - a tacit admission that CPUs couldn't be made faster...and the best we could do was run several in parallel, hoping that those tasks which didn't have to be run in sequence could run concurrently...with the results reintegrated by clever shuffling.

My 64-bit laptop reports eight cores. Which is to say it has four physical cores, each hyperthreaded to look like two.

Which leaves only the problem that most software is still 32-bit, and most either can't handle multicore processing, or runs tasks that can't be split into several parallel streams.
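To make the multicore point concrete, here's a minimal sketch - in Python, my own illustration rather than anything from the post - of a job that genuinely can be split into independent chunks, and so actually benefits from the extra cores:

```python
# A minimal sketch (my illustration, not from the post): eight independent
# chunks of work can use all the cores; the same work done one chunk at a
# time cannot.
import math
import os
import time
from multiprocessing import Pool

def count_primes(limit):
    """An independent chunk of work: naively count the primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, math.isqrt(n) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    print("Logical cores reported by the OS:", os.cpu_count())

    jobs = [50_000] * 8          # eight chunks with no dependencies between them

    start = time.perf_counter()
    serial = [count_primes(j) for j in jobs]
    print(f"one core : {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool() as pool:         # one worker process per logical core by default
        parallel = pool.map(count_primes, jobs)
    print(f"all cores: {time.perf_counter() - start:.2f}s")

    assert serial == parallel    # same answers, different wall-clock time
```

The catch is that plenty of real-world tasks are the other kind - each step needs the result of the previous one - and no number of extra cores helps with those.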

The focus in computer development has now shifted away from vainly trying to squeeze ever-diminishing gains out of silicon and copper, and towards everything around the CPU.

The USB 3.0 standard is roughly ten times the speed of USB 2.0 - but the new, faster data streams still have to be funnelled through the same CPU.

Screens are wider - mine is 17.5 inches. Good for watching widescreen TV - in fact it's rather difficult to buy a small flat-screen monitor for your PC. They can't make them much better, so the new notion of an upgrade is to make them bigger.

Mice, MIDI keyboards and headsets can now be wireless. Which means you get the slight convenience that wires which occasionally got tangled...now don't. And you can now sit on the other side of the room, listening to your MP3s over Bluetooth.

Old, obsolete, slow processors are reincarnated in netbooks and iPhones - where you can do what you did 20 years ago, at the speed of 10 years ago, but mobile, and on a very small screen. And pay for the experience.

It's a similar story with software, and operating systems.

Windows 8 was essentially Windows 7 with a new interface. An incredibly annoying, hard to use, badly thought-out interface, marketed on the bizarre assumptions that (a) everyone had touchscreen technology, (b) everyone wanted to use touchscreen technology, and (c) using a touchscreen to operate software designed to use a mouse...was somehow better.

It also pointlessly rearranged the configuration options, on the grounds that moving around the items on your shelves is the next best thing to getting better items.

Windows 8.1 made the further advance of bringing back some of the useful features which 8 had taken out.

"Bloatware" is the name we give to small, useful programs which acquire large, useless extra features. The alternative to bloating is to repackage the same program with the exact same functions, but with snazzier graphics and call it an upgrade.

Which is probably why our second belief is false. People aren't up to date.

Why use MS Office 2013 when MS Office 2003 does everything you could possibly want? Actually, MS Office '97 was all you needed, and it almost never crashed - but it won't install on your new system.

Why use Reason 7 when Reason 5 runs without a freaking dongle, and everything Reason 7 can do that Reason 5 can't do...you do anyway in Reaper?

More to the point, why install new versions of old Firefox plugins, knowing the new versions cause crashes with your other plugins?

I grudgingly moved from Windows XP to Windows 7, mainly because my new 8-core widescreen laptop cannot run XP. The BIOS simply doesn't have the option to run the disk controller in the old IDE mode - it only offers the admittedly better AHCI.

So I'm now running Windows 7 made up to look like XP.

Except half of my XP programs won't run under 7 - in spite of Microsoft's insistence that they can, in 7's "compatibility modes".

Which means, for those XP programs which don't have a 7 replacement, I'm running them in a virtual machine.

And finally, being up to date is expensive. And most people haven't got much money.

Out of date software that people still use is called "legacy" software, and the dirty open secret is that pretty much everyone is a legacy user.

Like in Blade Runner, the future is old. In the future, we'll all live in the past.


3 comments:

  1. The future looks very wet and rainy in Blade Runner. I'll be sure to pack my umbrella.

    Forget Microsoft Office; try LibreOffice - the freeware alternative.

  2. I use AbiWord for word processing and Gnumeric for spreadsheet stuff. Both free and both portable - but yes, I've heard good things about LibreOffice too.

  3. Music?
    Just asking, don't get mad.
