
Things That Turbo Pascal Is Smaller Than

timothy posted more than 2 years ago | from the celestial-emporium-of-benevolent-knowledge dept.

Programming

theodp writes "James Hague has compiled a short list of things that the circa-1986 Turbo Pascal 3 for MS-DOS is smaller than (chart). For starters, the entire Turbo Pascal 3.02 executable (compiler and IDE), at 39,731 bytes, is less than 1/4 the size of the image of the white iPhone 4S at apple.com (190,157 bytes), and less than 1/5 the size of the yahoo.com home page (219,583 bytes). Speaking of slim-and-trim software, VisiCalc, the granddaddy of all spreadsheet software, which celebrated its 32nd birthday this year, weighed in at a mere 29K."


Quite sad how bloated everything is (0)

suso (153703) | more than 2 years ago | (#37907014)

When I think back to playing vast adventure games like Below the Root, which amazingly fit on two sides of a 5.25" floppy, I realize the same game now would probably be written to take up a CD-ROM, even using the same graphics. Programmers have lost the ability to optimize.

Re:Quite sad how bloated everything is (4, Insightful)

fragfoo (2018548) | more than 2 years ago | (#37907078)

When I think back to playing vast adventure games like Below the Root, which amazingly fit on two sides of a 5.25" floppy, I realize the same game now would probably be written to take up a CD-ROM, even using the same graphics. Programmers have lost the ability to optimize.

I think they have the ability (some of them at least) but don't have the need or the time.

Re:Quite sad how bloated everything is (2)

azadrozny (576352) | more than 2 years ago | (#37907854)

I agree. There is a balance that needs to be struck here. If a developer is earning $40 to $50 per hour, is it worth it for him to spend more than 2 or 3 hours reducing the memory footprint of his application, by even 20%, when you can just slap an extra 4 gig in the server for about the same cost? Programmers do know how to optimize; in my example, it's the use of their time that they optimize.

Re:Quite sad how bloated everything is (1)

Chrisq (894406) | more than 2 years ago | (#37907088)

Programmers have lost the ability to optimize.

The ability and the need (programmers of embedded systems may be an exception). I think nothing of allocating an input buffer that is larger than the entire memory of the first machine I worked on (a Z80 box with 2K of memory).

New NES games (2)

tepples (727027) | more than 2 years ago | (#37907144)

The ability and the need (programmers of embedded systems may be an exception).

That and dedicated TV games [wikipedia.org], which commonly use [wikipedia.org] an architecture not unlike that of the Nintendo Entertainment System. New games are still being developed for the NES [nintendoage.com], and many are roughly the size of Turbo Pascal or smaller.

MOD PARENT DOWN!!! (-1, Offtopic)

Ossifer (703813) | more than 2 years ago | (#37907214)

Crazy. Even today I see code in review where programmers have implemented exponential-time sorting algorithms. It's almost as if the existence of faster CPUs and larger memory has enticed some to be extremely lazy. But it's never enough for some. For example, my MacBook Air goes from "ding-chord!" to signed in and usable in about 15 seconds. For Windows 7 on the very same machine the number is about 3:30, including 30 seconds of watching a fucking white cursor blink on an otherwise black screen! What the hell is it trying to do?

Re:MOD PARENT DOWN!!! (1, Informative)

boristdog (133725) | more than 2 years ago | (#37907470)

Obviously your machine is poorly configured. My $400 Dell notebook starts Win7 in about 15 seconds.

Re:MOD PARENT DOWN!!! (1)

Ossifer (703813) | more than 2 years ago | (#37907668)

I call BS. You can't even get through the BIOS bullshit on a Dell in 15 seconds.

Re:MOD PARENT DOWN!!! (0)

Anonymous Coward | more than 2 years ago | (#37907560)

Hooray for using anecdotes as data! My Win7 laptop is usable in ~45 seconds. It's 4 years old and cost less than the cheapest MacBook Air on the market, so go blow it out your hole.

Re:MOD PARENT DOWN!!! (0)

Anonymous Coward | more than 2 years ago | (#37907620)

Read the guidelines on moderating. Don't mod a post down just because you disagree with it. There's nothing wrong with GP's post. Similarly, your post shouldn't be modded down, either, despite the fact that you somehow screwed up your Windows 7 installation to be so slow.

Re:MOD PARENT DOWN!!! (1)

Ossifer (703813) | more than 2 years ago | (#37907712)

If you call "installing Win7 by choosing the default options and attaching to a local domain" screwing it up, I guess I screwed up by choosing M$ products...

Re:MOD PARENT DOWN!!! (3, Informative)

b4dc0d3r (1268512) | more than 2 years ago | (#37907866)

That's your problem right there: "attaching to a local domain". Windows does piles of things when attached to a domain that it otherwise doesn't do. It seems slow, but most likely it is a bunch of network timeouts waiting for something that will never happen.

It's quite simply proven, really. Put in the wrong password on a non-domain computer, and it comes back instantly. Do the same on a domain computer, and time it. It first has to check whether the domain controller is there and whether there is a new password, and then fall back on the locally cached hash.

It is also constantly sending out device discovery information, publishing and receiving info about who has printers and such, and on startup this information has to be collated from scratch (or so the OS thinks).

You can look into administration a little and optimize your startup to stop doing some of these things, which I would recommend even if you don't care about speed.

Re:MOD PARENT DOWN!!! (0)

Anonymous Coward | more than 2 years ago | (#37907922)

No, that's not what anyone meant. They mean your machine is screwed up: some software package or driver (or combination of them) has your system borked. A normal Windows 7 notebook boots in about 45 seconds, like a poster above mentioned. When you add things like corporate domain stuff (GPO, redirected folders, SCCM client, etc.) it can still be done in about 50 seconds if you optimize your images carefully. (Our corporate one loads in about 48 seconds today.) Three minutes and 30 seconds on your MacBook shows that something is seriously wrong. There are some corporate images that do take that long (when their IT staff hate their users and don't know about tools like xperf), but a home-use one should never be anywhere close to that long.

Re:MOD PARENT DOWN!!! (1)

hawguy (1600213) | more than 2 years ago | (#37907762)

Crazy. Even today I see code in review where programmers have implemented exponential-time sorting algorithms. It's almost as if the existence of faster CPUs and larger memory has enticed some to be extremely lazy. But it's never enough for some. For example, my MacBook Air goes from "ding-chord!" to signed in and usable in about 15 seconds. For Windows 7 on the very same machine the number is about 3:30, including 30 seconds of watching a fucking white cursor blink on an otherwise black screen! What the hell is it trying to do?

Do programmers still implement sort algorithms? I thought whatever library/framework they're using took care of sorting for them?

Re:MOD PARENT DOWN!!! (3, Informative)

zblack_eagle (971870) | more than 2 years ago | (#37907776)

That 30 seconds of cursor blinking? The bootloader hack that you used to make your unlicensed copy of Windows 7 think that it's genuine is waiting for feedback from the bios. The 30 seconds is how long it takes to give up and continue booting. There are other/newer hacks that avoid that issue.

39K ? Luxury! (3, Insightful)

goombah99 (560566) | more than 2 years ago | (#37907096)

Back in my day we had Basic running on a 2K altair. Kids these days don't know the meaning of a kilobyte.

Re:39K ? Luxury! (0)

Anonymous Coward | more than 2 years ago | (#37907162)

1024 bytes if you are sane
1000 bytes, if you are a drive maker.
10 - 20 years if you have an intent to distribute.

Re:39K ? Luxury! (2)

asc99c (938635) | more than 2 years ago | (#37907428)

> Kids these days don't know the meaning of a kilobyte.

Shouldn't that be a kibibyte?

Re:39K ? Luxury! (5, Funny)

Anonymous Coward | more than 2 years ago | (#37907604)

Only if you enjoy choking on penises.

Re:39K ? Luxury! (3, Insightful)

Ferzerp (83619) | more than 2 years ago | (#37907636)

No, it most certainly should not. That forced nomenclature is worse than what it ostensibly tries to solve.

Re:Quite sad how bloated everything is (1)

Zedrick (764028) | more than 2 years ago | (#37907190)

> Programmers have lost the ability to optimize.

(which is very sad, I agree)

When I have children, they'll only have access to a C64 with a Retro Replay or possibly a Chameleon Cart. That way, they will learn how a computer actually works, unlike kids (those under the age of 30) today

Re:Quite sad how bloated everything is (1)

Anonymous Coward | more than 2 years ago | (#37907286)

No, they will learn how an obsolete computer works, if they even bother with it, which they almost assuredly will not.

Rest assured, you won't do that. Virtually every "when I have children" prediction is just some excuse to pontificate about how other people raise their kids, and every one of them falls by the wayside of practicality when the little bundles of joy actually arrive. You will compromise, because I bet that in the end your 8-bit-forever principle isn't actually that important to you.

Oh, same rant goes for "I would fire any employee who ______". If you've never been a hiring manager, just keep it to yourself.

Re:Quite sad how bloated everything is (1)

tepples (727027) | more than 2 years ago | (#37907742)

No, they will learn how an obsolete computer works, if they even bother with it, which they almost assuredly will not.

I have a cousin in high school who in his spare time develops video games for obsolete computers because making graphics for them is a lot easier than making graphics for a 3D engine.

Re:Quite sad how bloated everything is (4, Insightful)

plover (150551) | more than 2 years ago | (#37907234)

How easily we overlook the difference between "bloated" and "quantity of useful information".

Just the words on this page (no markup, no graphics, and after a few comments) would have exceeded the capacity of your beloved 5-1/4 floppy. That's only the raw information, without bloat.

My first screen (a DECScope) had 12 lines x 80 columns each (I couldn't afford the 2K RAM that would have given me 24 x 80.) The screen I'm reading this on can display over 2 million RGB pixels. Calling things "bloat" is like telling me I should honor a display that's less than the size of the "close [X]" icon, because 12x80 isn't "bloat".

By the same twisted logic, Turbo Pascal itself was bloatware, and I thought it produced horribly slow and big code. Assemblers were where the real efficiency lay, and they were a lot smaller than 39K.

Nostalgia is fine. But leave it in the past.

Re:Quite sad how bloated everything is (1)

skids (119237) | more than 2 years ago | (#37907330)

Just the words on this page (no markup, no graphics, and after a few comments) would have exceeded the capacity of your beloved 5-1/4 floppy. That's only the raw information, without bloat.

I dunno, sometimes there seems to be a lot of bloat even in the raw data around here.

Re:Quite sad how bloated everything is (1)

Eraesr (1629799) | more than 2 years ago | (#37907624)

Mod parent up.
I've been thinking the exact same thing.
I saw the comment where someone used some video game that fit on two floppies back in 198x as an example of how programmers forgot how to optimize. Tell that to the developers of Rage [wikipedia.org] (a game that comes on 3 DVDs), who manage to stream hundreds of megabytes of texture data at 60 frames per second.

Stylized games need less VRAM bandwidth (1)

tepples (727027) | more than 2 years ago | (#37907768)

Tell that to the developers of Rage (a game that comes on 3 DVDs) who manage to stream hundreds of megabytes of texture data at 60 frames per second.

Perhaps the point is that an art style that requires "stream[ing] hundreds of megabytes of texture data at 60 frames per second" isn't the only art style for a fun video game.

Re:Quite sad how bloated everything is (2)

foobsr (693224) | more than 2 years ago | (#37907644)

Turbo Pascal itself was bloatware, and I thought it produced horribly slow and big code.

Different memories here, and Wikipedia gives support: "The Turbo name alluded to the speed of compilation and of the executables produced. The edit/compile/run cycle was fast compared to other Pascal implementations because everything related to building the program was stored in RAM, and because it was a one-pass compiler written in assembly language. Compilation was very quick compared to that for other languages (even Borland's own later compilers for C),[citation needed] and programmer time was also saved since the program could be compiled and run from the IDE. The speed of these COM executable files was a revelation for developers whose only prior experience programming microcomputers was with interpreted BASIC or UCSD Pascal, which compiled to p-code."

CC.

Re:Quite sad how bloated everything is (1)

Greyfox (87712) | more than 2 years ago | (#37907546)

In my senior year of high school, back in the late 80s, I wrote a graphing program for the Apple II using Apple Pascal and Turtlegraphics. It let you enter a bunch of numbers and labels and would draw a bar, line or pie graph of the results. The bar and line graph portion managed to fit on a single floppy, but I had to move the pie graph off to a separate one with the input routines copied. It was in the neighborhood of 20K, and I had to swap every function out to floppy so everything would have room to run, but it worked!

But I just wrote a cheesy little demo app in C++ that reads an orbit file for a satellite, reads an ephemeris file for the same satellite and acts as a very simple http server so that it can be a network link to plot the current orbit and last-seen location of the satellite in Google earth. ps reports that it's using 16K of RAM. I wasn't going out of my way to make it small, though. There are lower hanging fruit to optimize than memory these days. Back in the day you didn't have a choice about fitting into the environment and had to make a lot of trade-offs to manage it. These days maybe you can allocate a few megabytes so you can keep a hash table in memory and eke out a bit more speed. Knowing what trade-offs to make and where you can optimize is still key. My cheesy little app would need to be a good bit bigger to be actually useful.

Killer App (3, Interesting)

GalacticOvergrow (1926998) | more than 2 years ago | (#37907054)

Visicalc was the first killer app as well. I remember people coming into the store and asking for Visicalc computers not knowing it was a program that ran on an Apple II.

Re:Killer App (0)

Anonymous Coward | more than 2 years ago | (#37907182)

2 years of glory, and even then Apple only managed 6% of the market share.

Re:Killer App (1)

Gordonjcp (186804) | more than 2 years ago | (#37907378)

You could fit Visicalc into the codespace of an Arduino. Good luck fitting much of a spreadsheet and the systems variables into 2K of RAM though.

Pascal v/s C (2)

frodo from middle ea (602941) | more than 2 years ago | (#37907084)

When I first started learning Pascal, I had a real hard time wrapping my head around the concept of a pointer, and struggled with pointers and related concepts like linked lists throughout the time I had to use Pascal. I was using the Schaum's book on Pascal.

But under 'C' it felt somehow very natural and easy to understand, never had a problem with 'C' pointers, and data structures. R.I.P. DMR, your book really opened my eyes to the wonderful world of computing.

Re:Pascal v/s C (1)

elsurexiste (1758620) | more than 2 years ago | (#37907260)

Really? I felt it was the other way around: pointers on Pascal felt intuitive, while pointers in C made my programs more prone to failure.

Re:Pascal v/s C (3, Interesting)

satuon (1822492) | more than 2 years ago | (#37907418)

It was the same with me. I learned Turbo Pascal and knew about pointers, but only when I switched to C did I realize that pointers are numbers, like indexes into an array.

There were a lot of things that were easier to understand in C than in Pascal. For example, scanf and printf were just library functions, while in Pascal readln and writeln were parts of the language. Also, what "#include " did was perfectly clear: a simple text substitution, i.e. the same as if I had gone to the header and copy-pasted its contents into the .c file, while in Pascal, when you wrote "uses crt;", I wasn't sure what actually happened. The fact that text was an array of numbers was not clear to me while I was using Pascal, what with all the Chr and Ord functions to move between Character and Integer, and strings were part of the language and were like black boxes.

Re:Pascal v/s C (2)

PeterM from Berkeley (15510) | more than 2 years ago | (#37907508)

That's because you learned what pointers were in Pascal first. I did that too, Pascal first, then C. C made sense because I knew what pointers were.

--PM

Re:Pascal v/s C (1)

muecksteiner (102093) | more than 2 years ago | (#37907616)

In my opinion, the Pascal syntax for pointers was less intuitive because the chaps who developed Pascal were not the sort of people who actually liked to have pointers in an HLL, or at least who liked to use them there. Niklaus Wirth was more interested in higher-level concepts, but pointers had to be there for the language to be more than a plaything (remember LOGO?), so they were added. Without much love being lost on them in the process.

For DMR, on the other hand, pointers were part of a proper language ecosystem right from the get-go. It's no surprise that they feel more natural in C (the world's favourite macro assembler) than they do in Pascal... :-)

Use and abuse (1)

elPetak (2016752) | more than 2 years ago | (#37907102)

There was a time when you had to be very careful about the RAM and the processing limitations.
You had to make sure you had enough space for your variables and do some weird stuff to get more done with fewer resources.
That's all gone now; hardware is relatively cheap, so too few people care about optimizing the size of stuff.

Re:Use and abuse (1)

foobsr (693224) | more than 2 years ago | (#37907834)

hardware is relatively cheap, so too few people care about optimizing the size of stuff

One might come up with a theory that postulates that marketing has developed a toolset to counter Moore's law, one evil being secretly handing out incentives to keep code bloated.

CC.

Bytes? (5, Funny)

Applekid (993327) | more than 2 years ago | (#37907118)

Bytes are kind of weird. Can't they give these numbers in terms of Libraries of Congress?

Re:Bytes? (1)

skids (119237) | more than 2 years ago | (#37907386)

[deadpan]
I'm confused. They say the size was 24K. What's a K? Is that a typo, shouldn't it be M?
[/deadpan]

Fraction of an M (1)

tepples (727027) | more than 2 years ago | (#37907810)

Super Mario Bros. is 40 K. This means about 25 copies of it will fit in one M, and about 17,000 copies of it will fit in one CD.

So what? (2, Insightful)

Jeff Hornby (211519) | more than 2 years ago | (#37907128)

First thing that comes to mind is: so what? This whole argument that smaller is better is crap. The reason that software is bigger these days is that it does more for you. How productive was the GUI for Turbo Pascal? (It sucked.) How good were the other tools that came with it? (Nonexistent.) How fast were the release cycles? (About the same as today.) So with what people call bloat, we get better tools that make us more productive, thereby driving down the cost of software development.

Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a Hot Wheels?

Re:So what? (1)

mapkinase (958129) | more than 2 years ago | (#37907148)

Chill. It's a typical /. entry about how grass was greener (I testify, by the way, being a relative geezer).

Re:So what? (1)

msauve (701917) | more than 2 years ago | (#37907186)

"The reason that software is bigger these days is that it does more for you. "

"touch" does more than a Pascal compiler?

Re:So what? (1)

Spad (470073) | more than 2 years ago | (#37907228)

Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a hot wheels?

I think a more accurate comparison might be to ask why you need a Hummer to take your two kids to school when you can do it equally well in a Ford Focus which is half the size and probably 10 times as fuel efficient?

The reality is that people don't optimize software because they generally don't have to outside of embedded programming. Even software for iOS and Android can be horrendously bloated in terms of the resources it uses. Interestingly, it's console hardware limitations that have been leading to better optimized games of late, because the devs are having to work extremely hard to keep the console versions looking even close to the PC; see Crysis 2, pre-DX11-will-melt-your-PC patch of course.

Buying one SUV vs. two vehicles (1)

tepples (727027) | more than 2 years ago | (#37907414)

I think a more accurate comparison might be to ask why you need a Hummer to take your two kids to school when you can do it equally well in a Ford Focus which is half the size and probably 10 times as fuel efficient?

If there are in fact situations where one needs a Hummer or similar SUV, is it cheaper to fuel an SUV than to buy an additional small passenger car for trips that don't need the SUV?

Re:Buying one SUV vs. two vehicles (1)

mark-t (151149) | more than 2 years ago | (#37907802)

But, in general, unless people live in a rural area, they still do not really need a large vehicle most of the time, and when they do, such periods are usually no longer than a weekend, with the possible exception of a single period of maybe a couple of weeks sometime during an entire year. The question then becomes whether it is cheaper to always be fuelling an SUV as a regular mode of transport than it is to periodically rent one for the times when you actually need it.

Re:So what? (0)

Anonymous Coward | more than 2 years ago | (#37907476)

Hummer gets about 10mpg. I'm pretty sure the Focus doesn't get 100mpg.

(The Smart Fortwo is only rated at 41 mpg highway.)

Re:Smaller IS better (2)

TaoPhoenix (980487) | more than 2 years ago | (#37907294)

Sorry, your post went a little south with the Vista debacle. Because MS had to fast track a recovery of a whole new architecture, the result was not optimized at all ... and netbooks got crushed. Windows 7 is more sensible because they did have time to snip out a lot of the junk code.

Currently we're disparaging the need for tight code, but give it one skipped cycle of Moore's law and suddenly the software side will have to take up the slack. Currently it's the mobile phones with their weaker processors that are preventing "dock your phone into a workstation shell" from being the universal desktop in your pocket.

I'd love it if someone managed to make a meme that Tight Code Stops Terrorists. Then watch the OS fly!

Re:Smaller IS better (1)

tepples (727027) | more than 2 years ago | (#37907452)

Currently it's the mobile phones with their weaker processors that are preventing "dock your phone into a workstation shell" from being the universal desktop in your pocket.

Are you sure it's just the processor, or is it Apple's restrictions on what gets accepted into its App Store?

Re:Smaller IS better (1)

Dishevel (1105119) | more than 2 years ago | (#37907778)

Yes. That is it.
Because all good phones are made by Apple.
There are no other decent companies that make phones or phone OSes.

Re:Smaller IS better (0)

Anonymous Coward | more than 2 years ago | (#37907876)

Are you sure it's just the processor, or is it Apple's restrictions on what gets accepted into its App Store?

[sneer] No, it's actually Google's plot to add advertising to your compiler. [/sneer] Apple isn't the only mobile phone around, y'know. Those don't do any better at being a workstation, and also have no "excuse" other than logical, realistic ones.

Geez, give the, "[BLANK] iz teh EEVUL!" partisan non-politics a rest and grind your axe elsewhere, 'kay? You could at least move it to a more fitting subject.

Re:So what? (1)

ShakaUVM (157947) | more than 2 years ago | (#37907412)

Intellisense/autocompletion is the only IDE tool that is vaguely useful (though vastly overrated). You can use vim, or hell, notepad, to do all your coding, for real projects, along with gcc and make.

Code is, or can be, compact (that's why demos can be so small), and even in today's world there's still value in knowing how to code compactly.

Re:So what? (1)

Pentium100 (1240090) | more than 2 years ago | (#37907548)

The reason that software is bigger these days is that it does more for you.

It depends on whether I actually want that "more". For example, I use mpc-hc for playing music files because I only want a media player, not a media player+CD ripper+MP3 encoder+CD writer+music store+something else. So, for me, iTunes is bloated because the software has all those functions I do not usually need. When I need them, I can start another small program to do that job.

Re:So what? (2)

Kozar_The_Malignant (738483) | more than 2 years ago | (#37907550)

>The reason that software is bigger these days is that it does more for you.

Like Clippy?

Re:So what? (1)

gr8_phk (621180) | more than 2 years ago | (#37907676)

First thing that comes to mind is: so what? This whole argument that smaller is better is crap. The reason that software is bigger these days is that it does more for you.

Smaller is usually better for a given functionality. Your point that software does more is valid too.

To interject with my own babble: why does a composited desktop require OpenGL? That stuff can be done with a few K of highly optimized C or ASM code. Another example of bloat in the name of ease of implementation; never mind the added dependency.

Small, yes, but keep some perspective... (1)

JoeMerchant (803320) | more than 2 years ago | (#37907134)

Sure, these programs were small, but try to keep in perspective that they were leveraging the OS to get their compactness. It's kind of like saying a "Hello, World" GUI app is only 3 lines of code.... sure, 3 lines, plus 35 megs of library files running atop 1 gig of OS support. The .exe may only be 2k, but good luck getting that to do anything without serious support.

Re:Small, yes, but keep some perspective... (1)

Alioth (221270) | more than 2 years ago | (#37907278)

In the case of Turbo Pascal, probably pretty much everything was in that .exe. Don't forget there was no dynamic linking of libraries, let alone much in the way of libraries on MS-DOS in the first place. There would have been various int# calls within the executable to call the "operating system" but the OS (BIOS + DOS) itself wasn't all that much bigger.

Certainly in the case of the likes of spreadsheet programs for 8 bit systems, at most you probably had a 16K ROM in addition to the program itself, so still less than ~40K for the entire system.

Re:Small, yes, but keep some perspective... (1)

JoeMerchant (803320) | more than 2 years ago | (#37907630)

Even DOS provided a lot of high level file, screen and keyboard I/O functionality.

Yes, DOS itself wasn't that big, and, hey, I've implemented entire 8-bit systems with significant RS232 based I/O on 16K PROMs with no OS, so, yes, it can be done smaller.

I just get tweaked when somebody calls out "look ma, it's only 4K" and they're sitting on top of some ginormous library that's doing everything for them.

Re:Small, yes, but keep some perspective... (2)

tepples (727027) | more than 2 years ago | (#37907324)

OK, then let's try it on a Game Boy Advance, which has a 16 KiB BIOS. I wrote a simple terminal emulator plus font plus Hello World in C (or in C++ using only C features), and it compiled to less than 6000 bytes with no BIOS calls, and that's even without paying attention to optimization for space. But when I did the same using C++ <iostream> of GNU libstdc++, the file bloated to over 180,000 bytes.

Re:Small, yes, but keep some perspective... (1)

Brian Feldman (350) | more than 2 years ago | (#37907720)

That's because you pulled in swathes of new .o files. Nothing to do with the language; all compiler and static runtime library granularity limitations.

Re:Small, yes, but keep some perspective... (1)

flaming error (1041742) | more than 2 years ago | (#37907494)

"they were leveraging the OS to get their compactness"

You do realize 1986 Turbo Pascal ran on DOS, right? Back when developers wrote directly to the monitor and disk drives? When we had to have an intimate relationship with the stack? When there were no threads and one process owned the entire CPU?

Leverage that, young whippersnapper, and get off my lawn.

Re:Small, yes, but keep some perspective... (1)

JoeMerchant (803320) | more than 2 years ago | (#37907750)

DOS did more than people realize. I'm not saying Gates was a genius or anything, but there's a reason he got a lock-in: if DOS were truly useless, people would have re-coded their own drivers.

I seem to remember purchasing a third-party graphics library in the early '90s, MetaWindows or something like that. It gave me access to accelerated graphics that DOS wouldn't, without having to re-code for every graphics card on the market. But DOS still took care of the file system and keyboard (I think MetaWindows handled the mouse...); I suppose that was the genius in DOS, doing just enough to maintain market lock.

Re:Small, yes, but keep some perspective... (0)

Anonymous Coward | more than 2 years ago | (#37907800)

Leveraging the OS? You don't seem to have any idea of what happened back in those days. Yes in recent years the 4K and 64K intro/demo stuff have a lot more OS, lib and driver support. But that's not in the days of Visicalc.

Back then many applications/games were practically their own operating systems - since there was not enough space. They had to do all the OS stuff they needed themselves, and leave out the other stuff.

For the Apple II+ there was just 64K of address space - Misc IO, ROM, Graphic screen buffers, text screen buffers, RAM all shared that 64K address space. If you wanted a "high res" screen, there goes about 8KB, if you wanted to draw on one screen while showing the other fully drawn screen (to make animation look better) you lost another 8KB for that second screen.

The floppies only stored from 143KB to 160KB per side (depending how aggressive you were at the sectors and track formatting). And floppies were already high tech compared to cassette tapes :).

See also: http://lowendmac.com/orchard/06/visicalc-origin-bricklin.html

What the hell?! (1)

wondershit (1231886) | more than 2 years ago | (#37907150)

Good job. Posting obvious shit and getting on the main page of Slashdot. I hope I'm not the only one who thinks this sort of comparison is absolutely pointless. Why are we even comparing executables to web pages or images?

This just in (5, Insightful)

Opportunist (166417) | more than 2 years ago | (#37907152)

Software grows to fill the available ram.

Code is always a tradeoff between codesize, development time and ram needed for execution. I'm fairly sure you can optimize code today to a point that would put those programs (which were optimized 'til they squeaked to squeeze out that last bit of performance) to shame, but why? What for? 30 years ago, needing a kilobyte of ram less was the make or break question. When drivers weighed in the 10kb range and you still calculated which ones you absolutely need to load for the programs you plan to run, where you turned off anything and everything to get those extra 2 kb to make the program run. Today, needing a few megabytes of ram more is no serious issue. And mostly because it just really doesn't matter anymore. Do you care whether that driver, that program, that tool needs a megabyte more to run? Do you cancel it because it does? No. Because it just doesn't matter.

We passed the point where "normal" people care about execution speed a while ago. Does it matter whether your spreadsheet needs 2 milliseconds longer to calculate the results? I mean, instead of 0.2 you now need 0.202 seconds, do you notice? Do you care? Today, you can waste processing time on animating cursors and create colorful file copy animations. Why bother with optimization?

Because, and that's the key here, optimizing code takes time. And that costs money. Why should anyone optimize code if there's really no need for it anymore? And it's not the "lazy programmers" or the studios that don't care about the customers. The customers don't care! And with good reason they don't. They do care about the program being delivered on time and for a reasonable price, but they don't care whether it needs a meg more of ram. Because it just friggin' doesn't matter anymore!

So yes, yes, programs back in the good ol' days were so much better organized and they used ram so much better, they had so much niftier tricks to conserve ram and processing time, but in the end, it just doesn't matter anymore today. You have more ram and processing power than your ordinary office program could ever use. Why bother with optimization?

Re:This just in (1)

Tinctorius (1529849) | more than 2 years ago | (#37907246)

Why bother with optimization?

Energy consumption.

Re:This just in (1)

Opportunist (166417) | more than 2 years ago | (#37907420)

Explain please how using less ram and less HD space saves any energy.

Making a new computer costs energy (1)

tepples (727027) | more than 2 years ago | (#37907556)

Using less space on main storage allows the use of an SSD instead of an HDD. Using less RAM saves the energy that would be used to manufacture the RAM, along with the energy that would be used to manufacture a new computer if an application requires more RAM than the computer has slots for.

Re:Making a new computer costs energy (1)

Opportunist (166417) | more than 2 years ago | (#37907662)

Sorry, won't fly.

Using an SSD instead of an HDD would not fly for the obvious reason I gave at the start of my first post: space in RAM or on the HD will be filled by programs. There are plenty of SSDs today that have more than enough room to hold the OS, office suite and data files. Saying that the HD footprint of a program keeps you from buying an SSD is like saying that the size of your computer keeps you from buying a nicer, but smaller, apartment closer to the city. It's not the program that keeps you from buying an SSD. It's the fact that you don't want to delete some crap or move it to an external device.

And the manufacture of RAM is independent of how much you actually need. You might have noticed that 512MB sticks come in the same (physical) size as those fat 4-gig ones.

The only reason I could see is having a new OS and a new office suite needing a new computer to run because the old one can't "lift" it anymore. In that case, though, I question your need for the new office suite. What does it offer that the old one doesn't and that you actually need?

Re:This just in (0)

Anonymous Coward | more than 2 years ago | (#37907784)

Not necessarily less RAM and less HD space, but less processing power.

Busy loops will eat more wattage than smarter event-triggered wait states.

QED.
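The AC's point can be sketched in a few lines of Python (illustrative only, nothing from TFA): a busy loop keeps the core awake re-checking a flag, while an event-triggered wait parks the thread in the kernel until it is woken.

```python
import threading
import time

flag = threading.Event()

def worker():
    time.sleep(0.05)   # simulate a device that finishes 50 ms later
    flag.set()         # signal completion

# Busy wait: the CPU spins, re-checking the flag as fast as it can.
threading.Thread(target=worker).start()
spins = 0
while not flag.is_set():
    spins += 1         # every iteration is wasted work (and wasted watts)

# Event-triggered wait: the thread blocks and draws ~no CPU time.
flag.clear()
threading.Thread(target=worker).start()
flag.wait()            # woken exactly once, when the event fires

print(f"busy loop spun {spins:,} times; event wait spun 0 times")
```

The spin count varies by machine, but it will be in the millions for a 50 ms wait; the blocking wait costs essentially nothing in comparison.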

Re:This just in (1)

Kozar_The_Malignant (738483) | more than 2 years ago | (#37907594)

Why bother with optimization?

Energy consumption.

Speed

Re:This just in (1)

tepples (727027) | more than 2 years ago | (#37907346)

Today, you can waste processing time on animating cursors and create colorful file copy animations. Why bother with optimization?

Because some people have netbooks with Atom CPUs, and some people still use really old PCs because they're paid for. Clock-for-clock, a 1.6 GHz Atom is roughly comparable to a 1.6 GHz P4.

Re:This just in (1)

Opportunist (166417) | more than 2 years ago | (#37907540)

Even 1.6GHz single core is already way more than you could possibly need for pretty much any office application. We outgrew the office needs a long, long while ago. My guess would be somewhere around the P2/P3 era.

We need "faster" to run more crap, that's pretty much it. And yes, running a bloated OS on that Atom will not result in satisfactory performance. It's not the program that needs more RAM or more HD space that's the problem here, though. It's not even the program's CPU hunger. It's the many, many little programs that litter your RAM, from drivers for god knows what kind of freaky hardware you rarely if ever use, to "enhancements", and the omnipresent crud you get for free with every new netbook.

But calling for better optimized programs to make up for that shit is like calling for the roadmen to be more quiet with the jackhammer 'cause you can't hear your friends yakking. Instead of slapping the one that actually does some work, cut the crap that doesn't.

Uninstalling helps but isn't perfect (1)

tepples (727027) | more than 2 years ago | (#37907678)

It's the many, many little programs that litter your ram, from drivers for god knows what kind of freaky hardware you rarely if ever use, to "enhancements", and the omnipresent crud you get for free with every new Netbook.

Uninstalling unused programs helps, and it's the first thing I try when family members tell me a computer is running slow. But the fact remains that some programs won't run acceptably on a netbook or an old P4 PC even after I've uninstalled all that shit. I've seen applications fail to run because their forms are laid out for a monitor at least 768 pixels tall. I've seen Adobe Flash fail to keep up on Flash videos, and I've seen Firefox fail to keep up on HTML5 videos. I've seen native games fail to run because they aren't meant to scale down to the Dreamcast-class capability of an Intel GMA.

Re:This just in (1)

Nemyst (1383049) | more than 2 years ago | (#37907608)

Old PCs should run with software from the same era. You can't expect today's software to run on hardware from 10 years ago, or else why stop there? Why not ask for Windows 7 to run on a 486? OSX on an Apple II?

As for Atoms, they have no problem running Windows 7 or the latest Linux distros.

Re:This just in (0)

Anonymous Coward | more than 2 years ago | (#37907658)

I think I figured out why Microsoft's OSes are always too large for current machines. They have time travel capability.
They must send their OSes back from the future on machines much more capable than ours are.
It explains everything. Including Bill's continuing youthfulness. Damned immortality cream.

I hated Turbo Pascal (3, Interesting)

Baldrake (776287) | more than 2 years ago | (#37907154)

In my first job, I was responsible for developing a programming environment for a Pascal-like language that included a visual editor, interpreter and debugger. I remember my boss showing up in my office and showing me an ad he had cooked up, with big, bold lettering saying "Runs in 256 kB!"

As a young developer, it was one of the tougher moments in my life to admit that we were going to need a full 512 kB.

It was difficult living in a world where Turbo Pascal ran comfortably on a 64 kB machine.

Coincidence (1)

sl4shd0rk (755837) | more than 2 years ago | (#37907194)

39,731 is the exact number of milliseconds it takes to lose interest in an Apple fanboi blog.

Quickly (-1)

Anonymous Coward | more than 2 years ago | (#37907200)

Name three things that Jackie Chan is smaller than.

What is it not Smaller Than? (-1)

Anonymous Coward | more than 2 years ago | (#37907204)

Kim Kardashian's Marriage

Pictures are large (1)

Random2 (1412773) | more than 2 years ago | (#37907252)

Yes, it does stand to reason that something from 1986 is smaller than IMAGE files (yahoo's homepage, wiki's C++ page, PDFs, etc). That 256-color depth means we need 8 bits per pixel to index the colors, after all.

Instead of comparing to PDFs, image files, and things written 30 years later, why not compare to contemporaries?

I would be tempted to call this a slashadvertisement, but they're not even advertising something. Where's the 'pointlesslyIdle' tag when we need it?
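For scale, the arithmetic behind the parent's point (illustrative numbers, not the actual web images from TFA): a palette-indexed 256-color bitmap costs one byte per pixel, so even a single uncompressed screenful outweighs the compiler.

```python
# 256 colors = 2**8, so 8 bits (1 byte) per pixel in a palette-indexed bitmap.
width, height = 320, 200              # classic VGA mode 13h resolution
bytes_per_pixel = 1                   # one palette index per pixel
uncompressed = width * height * bytes_per_pixel   # 64,000 bytes

turbo_pascal = 39_731                 # bytes, per the summary
print(uncompressed, round(uncompressed / turbo_pascal, 2))
```

A bare 320x200 screenful is 64,000 bytes, already about 1.6x the size of the entire Turbo Pascal 3.02 executable, before any compression enters the picture.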

Re:Pictures are large (1)

DerPflanz (525793) | more than 2 years ago | (#37907510)

Instead of comparing to PDFs, image files, and things written 30 years later, why not compare to contemporaries?

He also compares it to the 'touch' command. Not even close to being an editor, let alone a compiler as well. And it's still bigger.

Re:Pictures are large (1)

Infiniti2000 (1720222) | more than 2 years ago | (#37907702)

It's not necessarily pointlessly idle, it's nostalgic. And, being nostalgic, comparing it to current data is appropriate. It's interesting to compare an application 25 years ago with today's data (be it applications, documents, or images). Those of you who didn't actually use Turbo Pascal (I did) will have similar bouts of nostalgia 20+ years from now about forgotten languages from today.

Sidekick (1)

gmuslera (3436) | more than 2 years ago | (#37907304)

Loved TP back in the days, but marvelous as it was, it was nothing compared with the miracle in few bytes that was Sidekick for DOS. Depending on how much you loaded into it, it was between 20 and 50k.

That brings me back (1)

dlhm (739554) | more than 2 years ago | (#37907316)

My second language was Turbo Pascal, after BASIC. I loved that language. I was 12, trying to create games: 320x200x256 raw graphics, stitched together with ASM. I actually created a photoshop-type program to edit my "proprietary" raw image files (I just added a larger header to regular raw graphics files), all in DOS with a Hyundai Super 286 AT, 12MHz, 40 meg HD, 4 meg of RAM and a 2400 baud modem.

that chart (5, Funny)

blaisethom (1563331) | more than 2 years ago | (#37907362)

At 88,225 bytes, the image showing the comparison is also bigger. Oh the irony

My favorite small program (1)

lfp98 (740073) | more than 2 years ago | (#37907400)

ImageQuant was a c. 1986 scientific Windows 3.1 program for quantitative analysis of 2D images. It displayed 16-bit images in either grayscale or false color. Draw a box around any object and it could integrate the intensity within, subtracting the average background of the perimeter. You could draw a long box of any width and it would display the integrated intensities as a line graph, then you could graphically mark off each peak and it would integrate the intensities, with multiple choices for setting the baseline. Select any area of the graph, and it would expand to fill the window. It could also rotate the image. Not sure of the exact size, but it fit on a single floppy.

Derive symbolic math app - single floppy (1)

mlosh (18885) | more than 2 years ago | (#37907436)

Derive [wikipedia.org] could do a lot of the symbolic math that Macsyma does as a single 5 1/4" floppy (360K?) DOS app in 1988. Awkward interface but useful for grad. physics.

TI-89 (1)

tepples (727027) | more than 2 years ago | (#37907696)

And because of its small size, Derive is still in use on TI-89 calculators.

But what did it cost? (0)

Anonymous Coward | more than 2 years ago | (#37907498)

Back then, according to some random website I found (http://www.jcmit.com/memoryprice.htm), in Sep 1986 memory cost $190/MB, meaning that "little executable" cost $7.55 worth of RAM to hold. Adjusted for inflation (http://dollartimes.com), that equates to $15.14 now. May not seem like a lot, but that same $15.14 can now get you a 1GB stick of DDR3 RAM, which is way faster and probably orders of magnitude more RAM than any machine had back then.

And what did that 30k piece of software get you anyway? Probably not much useful...
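For what it's worth, the AC's $7.55 figure checks out, provided you read "MB" as a decimal megabyte (a quick sketch; the 1986 price comes from the commenter's source and is taken on faith here):

```python
# Re-running the parent comment's arithmetic with its own numbers.
price_per_mb_1986 = 190.00            # USD per (decimal) megabyte, Sep 1986
exe_bytes = 39_731                    # Turbo Pascal 3.02 executable size
cost_1986 = exe_bytes / 1_000_000 * price_per_mb_1986
print(round(cost_1986, 2))            # 7.55 -- matches the comment's figure
```

With binary megabytes (1,048,576 bytes) the answer would come out around $7.20 instead, so the site the commenter used evidently priced RAM in decimal units.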

Re:But what did it cost? (1)

JoeMerchant (803320) | more than 2 years ago | (#37907818)

Useful has to be taken in context - even if TP is lame by today's standards, it was state of the art for its time - gave you a competitive edge when you used it.

Was a time when a hand-axe was a prized tool because poorly sized or shaped rocks just couldn't get the job done as quickly.

39 kb? Try 4! (0)

Anonymous Coward | more than 2 years ago | (#37907502)

And it's over 10 times as big as some of the graphical demos some people make nowadays!
For those who think programmers can't optimise anymore: check out the following video captures and try to remember that the programs that generated these videos are only 4 KB in size:
CDAK: http://www.youtube.com/watch?v=cjSJc2eCetE
Elevated: http://www.youtube.com/watch?v=_YWMGuh15nE

And a personal favorite of mine (although a massive 96 kb in size):
Debris: http://www.youtube.com/watch?v=wqu_IpkOYBg

Multimedia is always the biggest (0)

Anonymous Coward | more than 2 years ago | (#37907582)

If you strip all of the video/audio/image/UI data out of most of these things, and compressed and/or compiled down to bytecode the web pages shown, it wouldn't be all that shockingly different.

Easy XOR Fast it's exclusive OR (1)

Anonymous Coward | more than 2 years ago | (#37907590)

Software quality has decreased due to programming languages that "make the programmers life easier" by "doing the hard things" for them.

-Memory allocation is hard, let java do it for you, at a price

-String comparison is hard, let java do it for you, at a price

-Graphics is hard, let java do it for you, at a price.

Java, with its thousands of "helper" libraries, classes, etc, all causes needless bloat.

Memory is "free" cpu cycles are "free and fast"..why bother choosing an appropriate sort/data model, simply let java/library X do it for you, at a small price.

Hint: lots of "small prices" quickly add up.

Turbo Pascal Rocked (1)

Howard Beale (92386) | more than 2 years ago | (#37907798)

I remember using Turbo Pascal for my AI class - needed to write an Othello program. Originally wrote it on the CP/M version of Turbo Pascal, on a Coleco Adam - yes, a Coleco Adam. With dual cassette drives, dual disk drives and a parallel printer interface. Ported it over for final polish on the Zenith desktop PCs in the college lab. Professor gave me an A, couldn't beat the program.

"Doesn't this violate the laws of thermodynamics?" (1)

bfwebster (90513) | more than 2 years ago | (#37907836)

That's what my friend and boss Wayne Holder said about Turbo Pascal when I demo'd it for him back when it first came out in the early 80s. It wasn't just that TP was vastly smaller than any other Pascal (or C, FORTRAN, etc.) compiler out there, it's that it compiled much, much faster -- in some cases, an order or two of magnitude faster. ..bruce..

Despite being Pascal, it was tight. (4, Interesting)

Anonymous Coward | more than 2 years ago | (#37907880)

Turbo Pascal was pretty sweet, even though it came from Borland, and even if it was Pascal. It could compile 5,000 lines of code in the blink of an eye. Embedding assembly into it? No problem. It didn't care. The editor was supreme as well. Even after I stopped using TP, I still used the editor every day for a decade because it could do absolutely everything.

I'm not sure where all the hating is coming from, because TP did not generate hugely bloated executables. The only problem with it was that it eventually was discontinued, so special hacks like paspatch were required to patch TP compiled executables on the P II and higher to allow them to run.

It was actually closer to 512K with all of its dependencies, but it was damn fine.
