
Re: [LUG] CCC website hacked

 

On Sat, 2008-06-14 at 08:21 +0100, Tom Potts wrote:
> On Saturday 14 June 2008 00:00, tom wrote:
> > David Brook wrote:
> > > It is not only those at the top who are to blame - in fact I am not in
> > > the slightest convinced by the argument that the head of an IT department
> > > needs to know how to write a program.

He should at least know what can be programmed and what is just
pie-in-the-sky. (Since many probably don't know, the requirement then is
to ask someone who does first, but that's probably unlikely.)

> > On the other hand you could have a Director of IT who DOES know how to
> As for the IT director who does know how to program - never met one, met a lot 
> that said they could. I'd argue that there's a fault line just below that 
> level:
> engineers tend to produce things that are a result of logic, or something 
> resembling logic flow.*
> management mainly does things for political reasons - decision made, attempt 
> to provide logic later, or pass buck if choice has no logical solution.

<dilbert>
management makes decisions??
</dilbert>

> *Anyone do top down/bottom up design? Roughly outlined, you take a problem and 
> try and subdivide its functionality until eventually you have a lot of lowest 
> level functional blocks stuck together making up the solution.

Yep. Left to their own devices, most programmers approach design with a
"let's break this down" attitude. Unix has a tradition of fixing
problems by combining small, single-purpose tools into a workable whole.
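
To make the "break it down" approach concrete, here's a minimal sketch
in C - purely illustrative, with made-up function names, not taken from
any real project - of a trivial word-counting task split into
lowest-level blocks that are then stuck together at the top:

#include <stdio.h>
#include <ctype.h>

/* lowest-level block: is this character part of a word? */
static int is_word_char(int c)
{
    return c != EOF && !isspace(c);
}

/* next block up: consume one word from the stream, return 0 at end of input */
static int read_word(FILE *in)
{
    int c;

    do {                      /* skip leading whitespace */
        c = getc(in);
        if (c == EOF)
            return 0;
    } while (isspace(c));

    while (is_word_char(c))   /* consume the word itself */
        c = getc(in);
    if (c != EOF)
        ungetc(c, in);
    return 1;
}

/* top-level block: stick the lower blocks together */
static long count_words(FILE *in)
{
    long words = 0;

    while (read_word(in))
        words++;
    return words;
}

int main(void)
{
    printf("%ld words\n", count_words(stdin));
    return 0;
}

Each block can be tested, reused or replaced on its own, which is
exactly what the big monolithic applications make hard.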

What mystifies me sometimes is why the "high profile" open source
applications (yes, I'm looking at you, OOo and Mozilla) make such large
lumps instead of spinning out lower level stuff and using more of the
existing libraries.

> With M$ there are no low level functional blocks - just malfunctioning middle 
> tier!

The malfunctioning tier is the backwards compatibility - they tried to
break it with Vista, and nobody buys Vista: catch-22.

The secret to ongoing updates is access to the source code, but M$
haven't learnt that lesson yet. With free access to the source code
(including the ability to fork a dead project), third-party utilities
and niche applications can be updated and recompiled by others to make
versions for the latest kernels/system libraries.

Eventually, if there is insufficient interest in a particular app or
library, it can be dropped. After a while, the only packages left using
an old interface (like Gtk1) are those nobody wants to update, so they
can be dropped along with the interface. There is no need for backwards
compatibility because the applications that people care about can be
updated by those who need them (they have the source code) and those
that nobody cares a jot about any more can simply be dropped. Of course,
during the migration the OS must support both interfaces - Gtk1 and Gtk2
in the case of GNU/Linux - and it helps if the interfaces themselves are
properly compartmentalised so that you don't end up with huge levels of
bloat.
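
As a minimal sketch of what migrating an individual package looks like
in practice (the GTK calls are just the standard hello-world ones; the
structure is an assumption for illustration, not taken from any real
package), a single source tree can carry both code paths and pick one at
compile time, so each package can move to Gtk2 on its own schedule while
both libraries are installed:

#include <gtk/gtk.h>

/* Compile-time switch between the two interfaces; GTK_MAJOR_VERSION is
 * a real GTK macro, the rest of this file is only an illustration. */
static void connect_quit(GtkWidget *window)
{
#if GTK_MAJOR_VERSION >= 2
    /* Gtk2: signals go through the GObject layer */
    g_signal_connect(G_OBJECT(window), "destroy",
                     G_CALLBACK(gtk_main_quit), NULL);
#else
    /* Gtk1: the older gtk_signal_* interface */
    gtk_signal_connect(GTK_OBJECT(window), "destroy",
                       GTK_SIGNAL_FUNC(gtk_main_quit), NULL);
#endif
}

int main(int argc, char *argv[])
{
    GtkWidget *window;

    gtk_init(&argc, &argv);
    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    connect_quit(window);
    gtk_widget_show(window);
    gtk_main();
    return 0;
}

Once nobody left cares about the old branch, it - and eventually Gtk1
itself - can simply be dropped.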

The result is that backwards-compatibility is a gradually moving target
instead of a fixed barrier that is forcibly raised at each release. The
target moves with the packages according to the level of support each
package retains. The barrier cuts off packages whether they are needed
by a million users or none because the lack of source code forces the
removal of the interface, instead of a package-level migration. The
packages then need to be updated after the release, leading to the
current behaviour where users needing those apps still require XP until
they can have updated packages that work on Vista. By forcing everyone
to migrate at the same time you get a logjam effect because everyone is
waiting for someone else to migrate their code - "everybody knew that
somebody had to do it, but in the end nobody did it".

With access to the source code, backwards compatibility can be managed
on a package-specific basis and migrations can be gradual, not forced.
With proprietary systems, backwards compatibility has to be managed by
removing entire interfaces, rather than migrating individual packages
according to the level of interest.

It gets worse when the reason for removing an entire interface is not
entirely obvious to the programmers of the third-party utilities,
because they never saw the source code of the old one. They therefore,
quite rightly, suspect that the interface was removed merely to satisfy
some political point-scoring inside a monopolist organisation, which
leads to a reluctance to "play the game" by upgrading.

-- 


Neil Williams
=============
http://www.data-freedom.org/
http://www.nosoftwarepatents.com/
http://www.linux.codehelp.co.uk/



-- 
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/linux_adm/list-faq.html