

Re: [LUG] Re: Perl

 

Martin White wrote:

With reference to the last point, software hasn't really needed to be small
or quick in the last I-dunno-how-many years! Developers in certain fields
(I don't think it would be fair by any means to say globally) have been
spoilt.

The CPUs were getting ever faster and the amounts of memory people were
stuffing into their PCs were getting ever vaster.


I don't necessarily agree that the need for software to be small or quick has gone away simply because we now have faster PCs, Macs etc. which can run rings around earlier computers. Take that old nemesis Windows, for example. On my first ever PC, a 386 DX40 with a whopping 4Mb of RAM, Windows 3.1 took a couple of minutes to boot. I now have an Athlon 2400+ with 512Mb of RAM running Windows XP (when I do use Windows, that is). On paper the combination is at least 384 times faster, without even taking into account FSB, increased memory speeds etc. Is it any quicker? Can I load Windows in about a third of a second (120 secs / 384)? Can I heck! It's *slower* in real time than the old 386. So much for progress.
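As a quick sanity check of that back-of-envelope sum, here it is in Python. The 384x "on paper" factor and the 120-second boot time are the figures quoted above taken at face value; the raw clock and memory ratios are simply worked out from the specs of the two machines:

```python
# Back-of-envelope comparison of the two machines described above.
old_cpu_mhz, old_ram_mb = 40, 4      # 386 DX40 with 4Mb RAM
new_cpu_mhz, new_ram_mb = 2400, 512  # Athlon 2400+ (model rating taken at face value)

cpu_ratio = new_cpu_mhz / old_cpu_mhz   # 60x raw clock ratio
ram_ratio = new_ram_mb / old_ram_mb     # 128x memory ratio

claimed_factor = 384                    # combined "on paper" figure from the text
old_boot_secs = 120                     # Windows 3.1 boot time on the 386

# If boot time scaled with the claimed speedup:
ideal_boot_secs = old_boot_secs / claimed_factor

print(f"Clock ratio: {cpu_ratio:.0f}x, RAM ratio: {ram_ratio:.0f}x")
print(f"Boot time if it scaled with the claimed factor: {ideal_boot_secs:.2f} s")
```

Which of course is nothing like what the machine actually manages.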

When I was coding on the Atari ST, which was admittedly a long time ago, the principles I followed were simple: keep it modular, and KISS. If I gave my code to a friend who was programming at the same time and he couldn't follow what I had done, I deemed it too complicated and went back to the drawing board. I do admit to reusing modules again and again (why reinvent the wheel?), but if I reused a module, or used one I had found in another program, I always checked it to see whether I (still) thought it was the best way to solve that particular problem.

Businesses necessarily put a cost value on time. Why should they bother lashing out profits on, say, Windows XP on a 3GHz Pentium 4 when their old PII 450 runs Windows 2000 at the same speed, or better yet runs 98 at a faster one? (In this context, of course, I mean speed in minutes rather than MHz. :))

If modern Windows were written with the same efficiency of programming as Windows 3.1, then by now Windows would load in seconds, not minutes, as would the associated applications.

It's the same argument as the guy who buys a 15-seater minibus, fills it up with his mates and drives it at full speed *because he can*. If he'd bought a sensibly sized car and driven it at a sensible pace then a) he'd have fewer speeding tickets, fewer accidents and cheaper insurance, b) he'd get better fuel economy, and c) the vehicle would last longer.

If modern applications and OSs keep stretching the hardware's ability to perform, and hardware developers keep "overstretching the plumbing" to compensate, I think we'll see a lot more instability issues.

I agree with the point that hardware development is hitting, or has hit, a brick wall. The only way to make computers operate faster now in real time, in other words finish the same tasks quicker, is therefore to give the same equipment less to do, with cleaner, simpler code.

I suppose what I'm taking a long time to say is simply that, just because you have a 120Gb hard disk and an 8-hour working day, there is no rule that says you *have* to fill both with programs and/or processing time.

Just my fourpence (OK more like a shilling :))

Kind regards,

Julian

--
The Mailing List for the Devon & Cornwall LUG
Mail majordomo@xxxxxxxxxxxxx with "unsubscribe list" in the
message body to unsubscribe. FAQ: www.dcglug.org.uk/linux_adm/list-faq.html