[geeks] OSX Server

Charles Shannon Hendrix shannon at widomaker.com
Sat Mar 19 00:04:56 CST 2005


Fri, 18 Mar 2005 @ 12:33 -0800, Francisco Javier Mesa-Martinez said:

> On Fri, 18 Mar 2005, Charles Shannon Hendrix wrote:
> 
> > It is bloody stupid for a program which sends a few bytes of command
> > information to a mixer device to need 24MB of code + libraries to do
> > that job.
> 
> I was never denying that... just trying to understand how much of that
> size was the applet itself, vs. whatever is shared. On top of that Linux
> has at some point done some really retarded things regarding memory
> allocation to boot. :)

Like what?

UNIX in general has done amazingly stupid things with regard to both VM
and I/O.

Most mainframe/VMS/TOPS people don't even consider UNIX to *have* an I/O
subsystem.
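
To bring the mixer applet back up: the entire job it performs is an
open() and one ioctl().  A minimal sketch in C, assuming the classic
OSS /dev/mixer interface (the device path and the 75% volume are only
illustrative):

    /* mixer_set.c -- set the OSS master volume.
     * Build: cc -o mixer_set mixer_set.c */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/soundcard.h>

    int main(void)
    {
        int fd = open("/dev/mixer", O_RDWR);   /* illustrative path */
        if (fd < 0) {
            perror("open /dev/mixer");
            return 1;
        }

        /* OSS packs a stereo volume as left | (right << 8), 0-100. */
        int vol = 75 | (75 << 8);
        if (ioctl(fd, MIXER_WRITE(SOUND_MIXER_VOLUME), &vol) < 0) {
            perror("MIXER_WRITE");
            return 1;
        }

        close(fd);
        return 0;
    }

That compiles to a few kilobytes, which is what makes a 24MB footprint
for the same job so absurd.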

> > Also, when you have huge shared libraries that every program you run
> > needs, it puts excessive pressure on the VM subsystem.  It either
> > can't find opportunities to page them out, or it can't do so long
> > enough to avoid performance problems.
> 
> Hum, I do not see it that way.... the VM only sees pages, so the parts
> of the shared library that have low usage statistics will be swapped out.

I think you missed my point.  I'm not talking just about shared
libraries.  They have pros and cons, and usually they are a good thing.

What I am talking about is "HUGE shared libraries that EVERY
PROGRAM...needs".  

Gnome has a large number of shared libraries, some quite large, and most
of the programs use nearly all of them.

Gnome is highly interactive and it frequently loads "helper" execs.

What this means is that a significant portion of the library code is in
demand, so it doesn't get paged out.

More modular libraries, moving things out of Gnome that don't belong
there, and a design that let programs leave most of those libraries out
would increase (perhaps greatly) the probability of library pages being
paged out.
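
You can actually watch the residency happen.  A rough sketch, assuming
Linux's mincore() call; it maps a library read-only and counts how many
of its pages are in core right now (the GTK path is just an example):

    /* resident.c -- count resident pages of a shared library.
     * A sketch assuming Linux mincore().  Build: cc -o resident resident.c */
    #include <stdio.h>
    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/mman.h>
    #include <sys/stat.h>

    int main(int argc, char **argv)
    {
        /* example path; point it at any shared library */
        const char *path = argc > 1 ? argv[1]
                                    : "/usr/lib/libgtk-x11-2.0.so";
        struct stat st;
        int fd = open(path, O_RDONLY);
        if (fd < 0 || fstat(fd, &st) < 0) {
            perror(path);
            return 1;
        }

        void *map = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
        if (map == MAP_FAILED) {
            perror("mmap");
            return 1;
        }

        long psz = sysconf(_SC_PAGESIZE);
        size_t pages = (st.st_size + psz - 1) / psz;
        unsigned char *vec = malloc(pages);

        /* mincore() reports, per page, whether it is in core. */
        if (mincore(map, st.st_size, vec) < 0) {
            perror("mincore");
            return 1;
        }

        size_t resident = 0;
        for (size_t i = 0; i < pages; i++)
            resident += vec[i] & 1;

        printf("%s: %lu of %lu pages resident\n",
               path, (unsigned long)resident, (unsigned long)pages);
        return 0;
    }

Run that against the big Gnome libraries while a few Gnome apps are up
and nearly every page will be in core.  That is exactly the memory the
pager can't reclaim.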

> The overall cost for me, as direct hit to my pocket at time of
> adoption at least (and perception is a very important part of the
> equation) is very minimal.

That's why I was saying it isn't a direct cost.

> OSX had some serious growing pains. Still does some retarded things, but
> their APIs are light years ahead of GNOME that is for sure. OSX also does
> things like prebinding and whatnot, that really put a large price tag on
> memory resources it may need.

A lot of it is the graphics subsystem, which was designed to be a
long-term win.  They seem to be right about that so far.

What bothers me most right now is some of the really bad GUI design in
MacOS X.  Some of it is inferior to NeXT and MacOS.

Overall most of it is good, but some of it makes you say "huh?"

I had a URL that discussed the MacOS X GUI mistakes, but I've lost it.

> I tend to put GNOME in its real context, as a project sometimes it annoys
> the F out of me how they do things. And looking at some of the code, I was
> actually appalled. The good thing is that some people can try to fix that
> if they so choose (good luck), they seem to have some talented people...

That is its saving grace.

To me, if Sun, IBM, etc. were really serious, they would be pushing hard
for a more professional development track for Gnome.

Novell is making tentative moves by offering bounties for things like
memory leaks, but it's a really wimpy effort right now.

> but the overall design is just monstrous IMHO. But it is always nice
> to have a choice I guess. There is GNOME, KDE, WxWindows (which I
> actually prefer as my target for gui stuff), GNUStep, heck even xlib
> :), you also get your cocoa, or Win32...

I keep hoping GNUstep will get really good, and Apple won't stomp on it.

> Motif, just like GNOME, is what happens when you have something designed
> by a committee :).

The early 2.0 series of Motif was a real mess.

It's a lot better now.

Personally, I never minded Motif, and I don't think it is ugly.

I prefer stable GUIs over those whose developers spend more time on
themes than on reliability and usable features.

> What I see as progress is that one can develop rather complex gui
> applications relatively quickly, vs. the old ways.

I feel like rapid application development is one of the leading causes
of software quality degradation.  It has a place, of course, but now it
seems to be the driving force behind everything.

It seems to have encouraged a huge number of programs built with little
or no planning or design work.

I think it is a long-term loss.

It focuses primarily on reducing the time needed for coding.  That's
fine for churning out well-known kinds of apps, but most non-trivial
projects should involve mostly design and testing, not coding.

Of course, PHBs see something run and they say "Ship it!"

> It is not efficient or pretty, I guess. But the point of progress is
> usually to move from point A to point B; whether the path taken is the
> most elegant/efficient is sadly not usually a metric for the forward
> motion, IMHO.

In the long run I believe the industry is *increasing* the time it takes
to move from point A to point B, or at least increasing the fixes and
maintenance that must be done after you arrive.

> Sure, there is bloat (I don't know about the avoidable bugs), but
> memory densities get higher, and so do processor speeds. So when
> designers are faced with spending time squeezing performance vs. time
> to market or development time, well... one has to ride the technology
> curve I guess.

Seems to me that better foundations and better programmer training would
also decrease time-to-market.

Also, while a lot of code can be written faster these days, I think
in the long run you end up spending even more time on fixing and
maintaining it.



-- 
shannon "AT" widomaker.com -- [governorrhea: a contagious disease that
spreads from the governor of a state downward through other offices and his
corporate sponsors]


