Things That Turbo Pascal Is Smaller Than

theodp writes "James Hague has compiled a short list of things that the circa-1986 Turbo Pascal 3 for MS-DOS is smaller than (chart). For starters, at 39,731 bytes, the entire Turbo Pascal 3.02 executable (compiler and IDE) is less than 1/4th the size of the image of the white iPhone 4S at apple.com (190,157 bytes), and less than 1/5th the size of the yahoo.com home page (219,583 bytes). Speaking of slim-and-trim software, Visicalc, the granddaddy of all spreadsheet software, which celebrated its 32nd birthday this year, weighed in at a mere 29K."
This discussion has been archived. No new comments can be posted.

  • Killer App (Score:3, Interesting)

    by GalacticOvergrow ( 1926998 ) on Tuesday November 01, 2011 @10:12AM (#37907054)
    Visicalc was the first killer app as well. I remember people coming into the store and asking for Visicalc computers, not knowing it was a program that ran on an Apple II.
    • You could fit Visicalc into the codespace of an Arduino. Good luck fitting much of a spreadsheet and the system variables into 2K of RAM, though.

    • Thankfully it's still covered by copyright, or the whole market would implode!

  • by frodo from middle ea ( 602941 ) on Tuesday November 01, 2011 @10:14AM (#37907084) Homepage
    When I first started learning Pascal, I had a real hard time wrapping my head around the concept of a Pointer, and struggled with pointers and related concepts like linked lists, etc., throughout the time I had to use Pascal. I was using the Schaum's book on Pascal.

    But under 'C' it felt somehow very natural and easy to understand, never had a problem with 'C' pointers, and data structures. R.I.P. DMR, your book really opened my eyes to the wonderful world of computing.

    • Really? I felt it was the other way around: pointers in Pascal felt intuitive, while pointers in C made my programs more prone to failure.

    • Re:Pascal v/s C (Score:4, Interesting)

      by satuon ( 1822492 ) on Tuesday November 01, 2011 @10:33AM (#37907418)

      It was the same with me. I learned Turbo Pascal and knew about pointers, but only when I switched to C did I realize that pointers are numbers, like indexes in an array.

      There were a lot of things that were easier to understand in C than in Pascal. For example, scanf and printf were just library functions, while in Pascal readln and writeln were parts of the language. Also, what "#include" did was perfectly clear - a simple text substitution, i.e. the same as if I had gone to the header and copy-pasted its contents into the .c file - while in Pascal, when I wrote "uses crt;", I wasn't sure what actually happened. The fact that text was an array of numbers was not clear to me while I was using Pascal, what with all the Chr and Ord functions to move between Character and Integer, and strings were part of the language and were like black boxes.
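
      A minimal C sketch of the pointers-as-array-indexes idea described above; the array and its values are made up for illustration:

          #include <stdio.h>

          int main(void)
          {
              int a[4] = {10, 20, 30, 40};
              int *p = a;                 /* p holds the address of a[0] */

              /* Pointer arithmetic mirrors array indexing: all three print 30. */
              printf("%d\n", a[2]);
              printf("%d\n", *(p + 2));
              printf("%d\n", *(a + 2));   /* the array name decays to a pointer */

              /* The pointer really is a number: these two addresses differ by
                 exactly 2 * sizeof(int). */
              printf("%p %p\n", (void *)p, (void *)(p + 2));
              return 0;
          }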

      • by tibit ( 1762298 )

        Text is not an array of numbers. That's why a lot of applications required heavy-handed porting to Unicode. Text is an array of characters. I don't routinely need Chr or Ord (or their equivalents in any other language). Their use is pretty much limited to a somewhat hacky implementation of base conversion for I/O, and even there it's a safer bet to use a lookup array unless you really need to optimize memory use.

        If you have a string and want ASCII output (or input), you use a codec to do it. Any non-trivi...
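
        A small C sketch of the lookup-array idea mentioned above, next to the Chr/Ord-style arithmetic it replaces; the function names are illustrative:

            #include <stdio.h>

            /* Turn a value 0..15 into a hex digit. The lookup table makes no
               assumption about how letters are laid out in the character set. */
            static const char HEX_DIGITS[] = "0123456789ABCDEF";

            static char hex_digit_lookup(unsigned v)
            {
                return HEX_DIGITS[v & 0xF];
            }

            /* The arithmetic version is the C analogue of Pascal's
               Chr(Ord('0') + v). It relies on '0'..'9' being contiguous
               (guaranteed in C) and on 'A'..'F' being contiguous
               (true in ASCII, not guaranteed by the standard). */
            static char hex_digit_arith(unsigned v)
            {
                v &= 0xF;
                return (v < 10) ? (char)('0' + v) : (char)('A' + v - 10);
            }

            int main(void)
            {
                for (unsigned v = 0; v < 16; v++)
                    printf("%c %c\n", hex_digit_lookup(v), hex_digit_arith(v));
                return 0;
            }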

    • That's because you learned what pointers were in Pascal first. I did that too, Pascal first, then C. C made sense because I knew what pointers were.

      --PM

    • Comment removed based on user account deletion
  • Bytes? (Score:5, Funny)

    by Applekid ( 993327 ) on Tuesday November 01, 2011 @10:16AM (#37907118)

    Bytes are kind of weird. Can't they give these numbers in terms of Libraries of Congress?

    • by skids ( 119237 )

      [deadpan]
      I'm confused. They say the size was 24K. What's a K? Is that a typo, shouldn't it be M?
      [/deadpan]

    • by Barryke ( 772876 )

      I don't know how many Lines Of Crap the Library of Congress has; does anyone know?

  • So what? (Score:2, Insightful)

    by Jeff Hornby ( 211519 )

    First thing that comes to mind is: so what? This whole argument that smaller is better is crap. The reason that software is bigger these days is that it does more for you. How productive was the GUI for Turbo Pascal? (It sucked.) How good were the other tools that came with it? (Nonexistent.) How fast were the release cycles? (About the same as today.) So with what people call bloat we get better tools that make us more productive, thereby driving down the cost of software development.

    Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a hot wheels?

    • Chill. It's a typical /. entry about how grass was greener (I testify, by the way, being a relative geezer).

    • by msauve ( 701917 )
      "The reason that software is bigger these days is that it does more for you. "

      "touch" does more than a Pascal compiler?
    • by Spad ( 470073 )

      Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a hot wheels?

      I think a more accurate comparison might be to ask why you need a Hummer to take your two kids to school when you can do it equally well in a Ford Focus which is half the size and probably 10 times as fuel efficient?

      The reality is that people don't optimize software because they generally don't have to outside of embedded programming. Even software for iOS and Android can be horrendously bloated in terms of the resources it uses. Interestingly, it's console hardware limitations that have been leading to bet...

      • I think a more accurate comparison might be to ask why you need a Hummer to take your two kids to school when you can do it equally well in a Ford Focus which is half the size and probably 10 times as fuel efficient?

        If there are in fact situations where one needs a Hummer or similar SUV, is it cheaper to fuel an SUV than to buy an additional small passenger car for trips that don't need the SUV?

        • by mark-t ( 151149 )
          But, in general, unless people live in a rural area, they still do not really need a large vehicle most of the time, and when they do, such periods are usually no longer than a weekend, with the possible exception of a single period of maybe a couple of weeks sometime during an entire year. The question should then become whether it is cheaper to always be fuelling an SUV as a regular mode of transport than to periodically rent one for the times when you actually need it.
    • Sorry, your post went a little south with the Vista debacle. Because MS had to fast track a recovery of a whole new architecture, the result was not optimized at all ... and netbooks got crushed. Windows 7 is more sensible because they did have time to snip out a lot of the junk code.

      Currently we're disparaging the need for tight code, but give it one skipped cycle of Moore's law and suddenly the software side will have to take up the slack. Currently it's the mobile phones with their weaker processors that are preventing "dock your phone into a workstation shell" from being the universal desktop in your pocket.

      • by tepples ( 727027 )

        Currently it's the mobile phones with their weaker processors that are preventing "dock your phone into a workstation shell" from being the universal desktop in your pocket.

        Are you sure it's just the processor, or is it Apple's restrictions on what gets accepted into its App Store?

    • Intellisense/autocompletion is the only IDE tool that is vaguely useful (though vastly overrated). You can use vim, or hell, notepad, to do all your coding, for real projects, along with gcc and make.

      Code is/can be compact (that's why demos can be so small) and even in today's world, there's still value in knowing how to code compactly.

    • The reason that software is bigger these days is that it does more for you.

      It depends on whether I actually want that "more". For example, I use mpc-hc for playing music files because I only want a media player, not a media player+CD ripper+MP3 encoder+CD writer+music store+something else. So, for me, iTunes is bloated because the software has all those functions I do not usually need. When I need them, I can start another small program to do that function.

    • >The reason that software is bigger these days is that it does more for you.

      Like Clippy?

    • by gr8_phk ( 621180 )

      First thing that comes to mind is: so what? This whole argument that smaller is better is crap. The reason that software is bigger these days is that it does more for you.

      Smaller is usually better for a given functionality. Your point that software does more is valid too.

      To interject with my own babble, why does a composited desktop require OpenGL? That stuff can be done with a few K of highly optimized C or ASM code. Another example of bloat in the name of easy to implement - never mind the added depen...
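
      For a sense of scale, here is a sketch of the kind of small software compositing loop being described: a single ARGB32 "over" blend. The surface layout (32-bit pixels, row stride in pixels) is assumed for illustration:

          #include <stdint.h>
          #include <stdio.h>

          /* Blend one ARGB32 source window over a destination framebuffer
             using the source's per-pixel alpha: out = src*a + dst*(255-a). */
          static void blend_over(uint32_t *dst, size_t dst_stride,
                                 const uint32_t *src, size_t src_stride,
                                 size_t w, size_t h)
          {
              for (size_t y = 0; y < h; y++) {
                  for (size_t x = 0; x < w; x++) {
                      uint32_t s = src[y * src_stride + x];
                      uint32_t d = dst[y * dst_stride + x];
                      uint32_t a  = s >> 24;        /* source alpha, 0..255 */
                      uint32_t ia = 255 - a;

                      uint32_t r = (((s >> 16) & 0xFF) * a + ((d >> 16) & 0xFF) * ia) / 255;
                      uint32_t g = (((s >> 8)  & 0xFF) * a + ((d >> 8)  & 0xFF) * ia) / 255;
                      uint32_t b = ((s & 0xFF) * a + (d & 0xFF) * ia) / 255;

                      dst[y * dst_stride + x] = 0xFF000000u | (r << 16) | (g << 8) | b;
                  }
              }
          }

          int main(void)
          {
              uint32_t dst[4] = {0xFF000000, 0xFF0000FF, 0xFF00FF00, 0xFFFF0000};
              uint32_t src[4] = {0x80FFFFFF, 0x80FFFFFF, 0x80FFFFFF, 0x80FFFFFF};
              blend_over(dst, 4, src, 4, 4, 1);   /* blend a 4x1 "window" */
              printf("%08X\n", dst[0]);           /* 50%-white over black */
              return 0;
          }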

    • The reason that software is bigger these days is that it does more for you.

      That doesn't explain everything. In fact, it doesn't even explain very much. The only thing on that chart that does more than Turbo Pascal is the Erlang parser: the rest either deals with simple Unix commands or libraries, or is simple documentation (which doesn't do anything by itself).

      Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a hot wheels?

      The Hot Wheels car doesn't do what I want. De-bloating software means cutting out unnecessary crap, not necessary things.

  • Sure, these programs were small, but try to keep in perspective that they were leveraging the OS to get their compactness. It's kind of like saying a "Hello, World" GUI app is only 3 lines of code.... sure, 3 lines, plus 35 megs of library files running atop 1 gig of OS support. The .exe may only be 2k, but good luck getting that to do anything without serious support.

    • by Alioth ( 221270 )

      In the case of Turbo Pascal, probably pretty much everything was in that .exe. Don't forget there was no dynamic linking of libraries, let alone much in the way of libraries on MS-DOS in the first place. There would have been various int# calls within the executable to call the "operating system" but the OS (BIOS + DOS) itself wasn't all that much bigger.

      Certainly in the case of the likes of spreadsheet programs for 8 bit systems, at most you probably had a 16K ROM in addition to the program itself, so stil...
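
      A sketch of what those int calls looked like from a DOS-era Borland compiler: int86() and union REGS are the real <dos.h> interface, but this only builds with a 16-bit DOS compiler, not a modern one:

          #include <dos.h>

          /* INT 21h function 02h: DOS writes the character in DL to stdout. */
          static void dos_putchar(char c)
          {
              union REGS r;
              r.h.ah = 0x02;
              r.h.dl = c;
              int86(0x21, &r, &r);
          }

          int main(void)
          {
              const char *msg = "Hello from DOS\r\n";
              while (*msg)
                  dos_putchar(*msg++);
              return 0;
          }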

      • Even DOS provided a lot of high level file, screen and keyboard I/O functionality.

        Yes, DOS itself wasn't that big, and, hey, I've implemented entire 8-bit systems with significant RS232 based I/O on 16K PROMs with no OS, so, yes, it can be done smaller.

        I just get tweaked when somebody calls out "look ma, it's only 4K" and they're sitting on top of some ginormous library that's doing everything for them.

      • by Latent Heat ( 558884 ) on Tuesday November 01, 2011 @11:04AM (#37907974)
        The story was that not only did that 39K COM file image (Y'all remember the .EXE/.COM file distinction, and that if you went with a .COM file it had to shoehorn into 64K or you had to revert to overlays -- ick! -- or other tricks? Y'all remember DOS memory models? Or am I, like, really old?) contain the whole works -- editor, compiler, run-time library -- it was also Yet Another Pascal Compiler Compiled Using Itself.

        My suspicion is/was that the RTL (run-time library) was hand-coded in assembly language, and, judging from the .COM file sizes of stuff compiled with Turbo Pascal 3.0, that RTL ran maybe about 10-12K. That is, the Turbo Pascal image had the hand-coded RTL in the first 12K of the image, and the rest -- editor and Pascal compiler -- was written and compiled in Turbo Pascal and occupied the remainder, which was about the size/scale of a simple editor and a Pascal compiler, based on the complexity of source codes for those things that were "around." The cool thing, especially on dual floppy disk PCs, was that the 39K was everything, no overlays, no nothing else. The 12K RTL got plopped into the COM file compiled from your source codes.

        The thing about it is that yeah, yeah, you had the limitations of Pascal, the Small memory model, 64K data segment, and Borland didn't even get the 8087 math coprocessor support right (inline instructions instead of high-overhead function calls to a math library) until Turbo 4, which wasn't anywhere near as kewl as Turbo 3 from the standpoint of compactness. But you could develop useful apps with this thing on a dual-floppy machine.

        The other thing about this is the Pascal language. I had a conversation with a dude who was selling some 3rd party library for the Turbo Pascal ecosystem who expressed the view that -- hate the begin-end, hate the quirky use of the semicolon as a statement "separator" instead of "terminator", hate the bondage-and-discipline aspects (although the Turbo dialect of Pascal solved the fixed-length string problem and gave you enough overrides to the Pascal type safety to allow it to do anything C can) -- Pascal is the Ur Single-Pass Compiler language. I guess the Arch language of simple parsing at the expense of stupid-looking source would be Lisp, but Pascal was close behind in terms of simple syntax and simple compiler implementations. Back in the day, before we had Cray Y-MPs on our desks as we effectively do today, compilation of large programs in the time of a sneeze instead of a long coffee break was a huge, huge productivity booster that made up for whatever people hated about Pascal.

        So ol' Nicky Wirth was a smart dude when he invented Pascal, and Anders Hejlsberg (Philippe Kahn was just the front man) was also a smart hacker in coming up with Turbo 3, and you have to give the man his propers in hackerdom. For what it is worth, Hejlsberg crossed over to the Dark Side and is credited as the Chief Architect behind the abortive Microsoft Java ecosystem J-somethingoranother, from which came the good Visual Studio versions, C#, and all of that.

      • Turbo Pascal 5.5 had overlay file support, which was something like a dynamically linked library. Don't know whether older versions could do that, never used them.

    • by tepples ( 727027 )
      OK, then let's try it on a Game Boy Advance, which has a 16 KiB BIOS. I wrote a simple terminal emulator plus font plus Hello World in C (or in C++ using only C features), and it compiled to less than 6000 bytes with no BIOS calls, and that's even without paying attention to optimization for space. But when I did the same using C++ <iostream> of GNU libstdc++, the file bloated to over 180,000 bytes.
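
      A freestanding GBA-style sketch of why the plain-C version stays so small: it pokes documented memory-mapped registers directly and links no library at all. The register addresses are the standard GBA ones; the drawing itself is just illustrative:

          typedef volatile unsigned short vu16;

          #define REG_DISPCNT (*(vu16 *)0x04000000) /* display control register */
          #define MODE3       0x0003                /* 240x160, 16bpp bitmap mode */
          #define BG2_ENABLE  0x0400
          #define VRAM        ((vu16 *)0x06000000)  /* mode-3 framebuffer */

          int main(void)
          {
              REG_DISPCNT = MODE3 | BG2_ENABLE;

              /* Draw a short white line; a real terminal emulator would blit
                 font glyphs into the framebuffer the same way. */
              for (int x = 0; x < 100; x++)
                  VRAM[80 * 240 + 70 + x] = 0x7FFF; /* BGR555 white */

              for (;;) { } /* GBA programs never return */
          }
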
    • "they were leveraging the OS to get their compactness"

      You do realize 1986 Turbo Pascal ran on DOS, right? Back when developers wrote directly to the monitor and disk drives? When we had to have an intimate relationship with the stack? When there were no threads and one process owned the entire CPU?

      Leverage that, young whippersnapper, and get off my lawn.

        • DOS did more than people realize. I'm not saying Gates was a genius or anything, but there's a reason he got a lock-in: if DOS was truly useless, people would have re-coded their own drivers.

          I seem to remember purchasing a third party graphics library in the early '90s, MetaWindows or something like that; it gave me access to accelerated graphics that DOS wouldn't, without having to re-code for every graphics card on the market. But DOS still took care of the file system and keyboard (I think MetaWindo...

    • by mark-t ( 151149 )
      Turbo Pascal leveraged the OS primarily for things like necessary I/O. Everything else, including the implementation of the basic user interface and all of the controls, was completely self-contained in that exe.
  • This just in (Score:5, Insightful)

    by Opportunist ( 166417 ) on Tuesday November 01, 2011 @10:18AM (#37907152)

    Software grows to fill the available ram.

    Code is always a tradeoff between code size, development time and RAM needed for execution. I'm fairly sure you can optimize code today to a point that would put those programs (which were optimized 'til they squeaked to squeeze out that last bit of performance) to shame, but why? What for? 30 years ago, needing a kilobyte of RAM less was the make-or-break question, when drivers weighed in the 10kb range and you still calculated which ones you absolutely needed to load for the programs you planned to run, and turned off anything and everything to get those extra 2 kb to make the program run. Today, needing a few megabytes of RAM more is no serious issue. And mostly because it just really doesn't matter anymore. Do you care whether that driver, that program, that tool needs a megabyte more to run? Do you cancel it because it does? No. Because it just doesn't matter.

    We passed the point where "normal" people care about execution speed a while ago. Does it matter whether your spreadsheet needs 2 milliseconds longer to calculate the results? I mean, instead of 0.2 you now need 0.202 seconds, do you notice? Do you care? Today, you can waste processing time on animating cursors and create colorful file copy animations. Why bother with optimization?

    Because, and that's the key here, optimizing code takes time. And that costs money. Why should anyone optimize code if there's really no need for it anymore? And it's not the "lazy programmers" or the studios that don't care about the customers. The customers don't care! And with good reason they don't. They do care about the program being delivered on time and for a reasonable price, but they don't care whether it needs a meg more of ram. Because it just friggin' doesn't matter anymore!

    So yes, yes, programs back in the good ol' days were so much better organized and they used ram so much better, they had so much niftier tricks to conserve ram and processing time, but in the end, it just doesn't matter anymore today. You have more ram and processing power than your ordinary office program could ever use. Why bother with optimization?

    • by tepples ( 727027 )

      Today, you can waste processing time on animating cursors and create colorful file copy animations. Why bother with optimization?

      Because some people have netbooks with Atom CPUs, and some people still use really old PCs because they're paid for. Clock-for-clock, a 1.6 GHz Atom is roughly comparable to a 1.6 GHz P4.

      • Even a 1.6GHz single core is already way more than you could possibly need for pretty much any office application. We outgrew office needs a long, long while ago. My guess would be somewhere around the P2/P3 era.

        We need "faster" to run more crap, that's pretty much it. And yes, running a bloated OS on that Atom will not result in satisfactory performance. It's not the program that needs more ram or more HD space that's the problem here, though. It's not even the program's CPU hunger. It's the many, many little programs that litter your ram, from drivers for god knows what kind of freaky hardware you rarely if ever use, to "enhancements", and the omnipresent crud you get for free with every new Netbook.

        • It's the many, many little programs that litter your ram, from drivers for god knows what kind of freaky hardware you rarely if ever use, to "enhancements", and the omnipresent crud you get for free with every new Netbook.

          Uninstalling unused programs helps, and it's the first thing I try when family members tell me a computer is running slow. But the fact remains that some programs won't run acceptably on a netbook or an old P4 PC even after I've uninstalled all that shit. I've seen applications fail to run because their forms are laid out for a monitor at least 768 pixels tall. I've seen Adobe Flash fail to keep up on Flash videos, and I've seen Firefox fail to keep up on HTML5 videos. I've seen native games fail to run bec...

      • by Nemyst ( 1383049 )

        Old PCs should run with software from the same era. You can't expect today's software to run on hardware from 10 years ago, or else why stop there? Why not ask for Windows 7 to run on a 486? OSX on an Apple II?

        As for Atoms, they have no problem running Windows 7 or the latest Linux distros.

  • I hated Turbo Pascal (Score:4, Interesting)

    by Baldrake ( 776287 ) on Tuesday November 01, 2011 @10:18AM (#37907154)

    In my first job, I was responsible for developing a programming environment for a Pascal-like language that included a visual editor, interpreter and debugger. I remember my boss coming into my office and showing me an ad he had cooked up, with big, bold lettering saying "Runs in 256 kB!"

    As a young developer, it was one of the tougher moments in my life to admit that we were going to need a full 512 kB.

    It was difficult living in a world where Turbo Pascal ran comfortably on a 64 kB machine.

  • 39,731 is the exact number of milliseconds it takes to lose interest in an Apple fanboi blog.

  • Yes, it does stand to reason that something from 1986 is smaller than IMAGE files (yahoo's homepage, wiki's C++ page, PDFs, etc.). That 24-bit color depth means we need 24 bits to define each pixel's color, after all.

    Instead of comparing to PDFs, image files, and things written 30 years later, why not compare to contemporaries?

    I would be tempted to call this a slashvertisement, but they're not even advertising something. Where's the 'pointlesslyIdle' tag when we need it?

    • Instead of comparing to PDFs, image files, and things written 30 years later, why not compare to contemporaries?

      He also compares it to the 'touch' command. Not even close to being an editor, let alone a compiler as well. And still bigger.

    • It's not necessarily pointlessly idle, it's nostalgic. And, being nostalgic, comparing it to current data is appropriate. It's interesting to compare an application 25 years ago with today's data (be it applications, documents, or images). Those of you who didn't actually use Turbo Pascal (I did) will have similar bouts of nostalgia 20+ years from now about forgotten languages from today.
  • Loved TP back in the day, but marvelous as it was, it was nothing compared with the miracle in a few bytes that was SideKick for DOS. Depending on how much it had packed in, it was between 20 and 50k.
  • that chart (Score:5, Funny)

    by blaisethom ( 1563331 ) on Tuesday November 01, 2011 @10:30AM (#37907362)

    At 88,225 bytes, the image showing the comparison is also bigger. Oh, the irony.

  • I remember using Turbo Pascal for my AI class - needed to write an Othello program. Originally wrote it on the CP/M version of Turbo Pascal, on a Coleco Adam - yes, a Coleco Adam. With dual cassette drives, dual disk drives and a parallel printer interface. Ported it over for final polish on the Zenith desktop PCs in the college lab. The professor gave me an A; he couldn't beat the program.
  • That's what my friend and boss Wayne Holder said about Turbo Pascal when I demo'd it for him back when it first came out in the early 80s. It wasn't just that TP was vastly smaller than any other Pascal (or C, FORTRAN, etc.) compiler out there, it's that it compiled much, much faster -- in some cases, an order or two of magnitude faster. ..bruce..

  • by Anonymous Coward on Tuesday November 01, 2011 @10:57AM (#37907880)

    Turbo Pascal was pretty sweet, even though it came from Borland, and even if it was Pascal. It could compile 5,000 lines of code in the blink of an eye. Embedding assembly into it? No problem. It didn't care. The editor was supreme as well. Even when I stopped using TP, I still used the editor every day for a decade after the fact because it could do absolutely everything.

    I'm not sure where all the hating is coming from, because TP did not generate hugely bloated executables. The only problem with it was that it eventually was discontinued, so special hacks like paspatch were required to patch TP-compiled executables on the P II and higher to allow them to run.

    It was actually closer to 512K with all of its dependencies, but it was damn fine.
