Progress of general purpose computing

Chewy509

Hi Guys,

So I sit here at work waiting for some unit tests to complete (they take about 5 minutes) after doing some code modifications, and I find myself wondering just how resource-heavy our modern software has become.

(Disclaimer: these thoughts were sparked by an article on /. today about a ray tracer written in 254 bytes.)

So at my desk is a machine (i7-860, 8GB RAM, 250GB SSD, about 4TB of spinning rust, Win7). On start-up it's using 1.1GB of RAM (no apps open, just logged in), and with my IDE open (Eclipse) it's using 1.6GB. Just about every app I run wants 50MB+, e.g. Chrome with 2 tabs - 220MB, Outlook a measly 35MB... and I count at least 8 auto-update applications (Java, Acrobat, AV, Windows Update, NVIDIA update, Bing Desktop update, Google update x2).

So when did having 4GB of RAM become the minimum, and WTF does modern software need all these resources for, to accomplish what we did with a tenth of the resources years ago? (S^&t, some might argue, 640K is all we really need.)

My home desktop (yes, it runs Solaris 11, a server-orientated OS) only uses 480MB on start-up, and yet it now seems to struggle with only 4GB of RAM** when I want to do more than a few things at once. (I'll exclude the use of VMs - I do have a Win8 VM running that needs at least 2GB to be usable.) But with Firefox, Thunderbird, Pidgin and Eclipse or NetBeans open, I just keep running low on RAM...

I remember the day when having 4MB was huge (for DOOM), and I completely understand RAM requirements when working with large images or files or databases... but for code, I thought we were better than that. (My current Uni project needs about 120GB of RAM for processing genome sequences, yet the code itself is a measly 120KB of C++...)

I don't know, maybe I'm in the wrong industry, or there are a LOT of really crap people in this industry who still believe that we can just buy more RAM. (I await the day when a virus scanner needs 4GB of RAM just for itself!)

Sorry, this is just a rant whilst I'm waiting for tests to run...

**My current home desktop has 4GB of RAM (4x 1GB Reg ECC DDR2-800), and last I looked, maxing out the RAM to 16GB would require roughly the same investment as a new i7 box with a new motherboard, SSD, PSU, case, etc. - and with double the RAM! Hard to justify to the wife since I'm still only working part time whilst I finish my degree.
 

mubs

As a guy who wrote complex payroll programs to run in 16K (yeah, 16K - IBM 1401 mainframe, Autocoder assembly), I fully understand your frustration. Some years later, when I was programming Data General systems, their IDE would generate a 2MB executable for a one-line "Print 'Hello'" program. It's been a long slide downhill since then.

Steve Gibson thinks like you do and laments the current situation. He's a fan of assembly programming and has some examples on his website.
 

jtr1962

I don't think compilers these days even attempt to optimize things. I've been programming microcontrollers for the last few years, and one thing I've noticed is that when I write a routine in assembly, it will often run MUCH faster than writing the same thing in a higher-level language and letting the compiler generate the assembly. If it's this bad for microcontrollers, which by definition have scant resources (the 16F628A I use has 224 bytes of RAM) and really need optimization, I can only imagine how bloated things are at the x86 level. Really, I think the entire situation needs to be looked at from the top down. There's no reason why a browser with three tabs open should be using half a gig of RAM (as Chrome is doing right now).
 

Chewy509

I love programming in x86 assembler as a hobby, but my day job and most of my Uni work is in Java. (I started my coding in 6510 assembler on the C64.)

I have nothing against Java (I actually quite like the language - except for the lack of unsigned ints, which pisses me off), and I believe the JVM is a really good piece of engineering, but the number of additional libraries one tends to see being used - holy crap. And I wonder why we have software bloat...
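Just to illustrate the unsigned-int gripe (a minimal sketch, not from any real codebase): the usual workaround is to widen to the next bigger signed type and mask off the sign extension, and you end up sprinkling that little dance everywhere you handle raw bytes or wire formats.

    public class UnsignedDemo {
        public static void main(String[] args) {
            // Java ints are always signed, so the bit pattern 0xFFFFFFFF reads as -1.
            int raw = 0xFFFFFFFF;

            // Widen to long and mask off the sign extension to recover
            // the unsigned value 4294967295.
            long unsigned = raw & 0xFFFFFFFFL;

            System.out.println(raw);       // prints -1
            System.out.println(unsigned);  // prints 4294967295
        }
    }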

Just look at the amount of space Windows uses for DLLs, etc. just for itself... amazing... Mind you, most consumer-orientated Linux distros are just as bad these days, but at least that 8GB+ of HDD used is the entire OS + apps + games + ..., not just the OS itself.

No wonder Raspberry Pi and other hobby embedded systems are taking off again...
 

Mercutio

A LOT of general-purpose computing these days is exceptionally complex because we're in essence running miniature operating systems that sit on top of operating systems, in the form of web browsers. Shit, Chrome has its own natural-language speech processor now. Web browsers have an AI/expert-system component devoted to figuring out what the hell we mean if we don't explicitly type a web address in the address bar. They have their own complex memory management, process and security models. That's pretty heavy stuff.

A tremendous amount of code isn't really native to the system but abstracted to a vastly higher level. We write VBA to be interpreted by MS Office, or .Net code, or JavaScript, or Java, or Flash, each of which amounts to a virtual machine for one particular type of code. We do that to maintain portability and to (hopefully) improve security, but of course the trade-off is that we're no longer writing 20K binaries that can run on the resources present on the average feature phone.

Yes, we could still get our hands dirty and write C to do everything, but who wants to deal with memory management or reinvent a 300-line-of-code wheel that's already available in a nice-but-inefficient 2-line call to a .Net library?
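To put a rough face on that trade-off (Java here rather than .Net, and purely an illustrative sketch): the convenient library call hides a pile of machinery you would otherwise have to write - and could tune - yourself.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class WheelDemo {
        public static void main(String[] args) throws IOException {
            // The nice two-line version: one call drags in the NIO machinery,
            // charset decoding, and an ArrayList holding every line in memory.
            List<String> lines = Files.readAllLines(Paths.get("input.txt"), StandardCharsets.UTF_8);
            System.out.println(lines.size() + " lines read");
        }
    }

The hand-rolled C equivalent would be a buffered read loop with its own error handling and memory management - far leaner, but nobody is going to write it just to count lines, which is rather the point.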
 

Howell

Memory is cheaper than the employee time it takes to write code in assembler. Still, there is money to be made in optimizing code, especially as we cram more people onto fewer resources through virtualization.
 

P5-133XL

It was drilled into me a long time ago: HW is very cheap and there's always lots of surplus capability built in, but programmer time is expensive - so let the user worry about it if the program takes 10 seconds longer and an extra 100MB.

Really, every programmer cares a lot about the quality and efficiency of their code! The problem is that that's not the only POV involved: there are also the people who write their checks, and they care much more that the code didn't cost too much to write and that it works - most of the time.
 

Howell

As Merc alluded to, it may be phones and other battery-limited devices that push us toward more efficient code.
 