
Chuck Moore Holds Forth

timothy posted more than 12 years ago | from the lots-of-little-processors dept.


A little while ago you asked Forth (and now colorForth) originator Chuck Moore about his languages, the multi-core chips he's been designing, and the future of computer languages -- now he's gotten back with answers well worth reading, from how to allocate computing resources on chips and in programs, to what sort of (color) vision it takes to program effectively. Thanks, Chuck!

FFP, Combinator Calculus and Parallel Forth
by Baldrson

In his 1977 Turing Lecture, John Backus challenged computists to break free of what he called "the von Neumann bottleneck". One of the offshoots of that challenge was work on massive parallelism based on combinator calculus, a branch of mathematics that is far closer to Forth's formalism than parameter-list systems (which are more or less lambda calculus derivatives).
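
For concreteness, a minimal sketch in standard Forth (the word names are invented for illustration): a concatenative definition composes words without ever naming a parameter, which is essentially the point-free style of combinator calculus, whereas a lambda-calculus-derived language binds an explicit argument list.

    \ Point-free (combinator-style) definitions: no parameters are named;
    \ the words simply compose left to right on the stack.
    : SQUARE  ( n -- n*n )          DUP * ;
    : SUM-SQ  ( a b -- a*a+b*b )    SQUARE SWAP SQUARE + ;
    3 4 SUM-SQ .    \ prints 25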

The prolific Forth aficionado Philip Koopman did some work on combinator reduction related to Forth but seems not to have followed through with implementations that realize the potential for massive parallelism that was pursued in the early 1980s by adherents of Backus's Formal Functional Programming paradigm. Given recent advances in hierarchical grammar compression algorithms, such as SEQUITUR, that are one step away from producing combinator programs as their output, and your own statements that Forth programming consists largely of compressing idiomatic sequences, it seems Backus's original challenge to create massively parallel Formal Functional Programming machines in hardware is near realization with your new chips -- lacking only some mapping of the early work on combinator reduction machines.
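
As a hedged illustration of "compressing idiomatic sequences" (the words below are invented for this example): when the same phrase keeps recurring in definitions, a Forth programmer factors it into a new word, much as a grammar compressor like SEQUITUR replaces a repeated digram with a new rule.

    \ Before factoring: the phrase DUP @ SWAP recurs in several definitions.
    \ : A  ... DUP @ SWAP ... ;
    \ : B  ... DUP @ SWAP ... ;
    \ After factoring: the repeated idiom becomes a named rule of its own.
    : FETCH-UNDER  ( addr -- x addr )  DUP @ SWAP ;
    \ : A  ... FETCH-UNDER ... ;
    \ : B  ... FETCH-UNDER ... ;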

It is almost certainly the case you are aware of the relationship between combinator reduction machines and Forth machines -- and of Backus's challenge. What have you been doing toward the end of unifying these two branches of endeavor so that the software engineering advantages sought by Backus are actualized by Forth machines of your recent designs?

Chuck Moore: What can I say? Backus did not mention Forth in his lecture. He probably didn't know of it then. Yet Forth addresses many of his criticisms of conventional languages.

He thinks a language needs or benefits from a formal specification. I grew up worshiping Principia Mathematica until I learned how Goedel refuted it. The result is that I distrust formal representations. For example, the ANSI Forth standard does not describe Forth, but a language with the same name.

Yes, I am struck by the duality between Lisp and Lambda Calculus vs. Forth and postfix. But I am not impressed by the productivity of functional languages. Even as research tools, they have failed to live up to their promise. By that I mean to do something with computers that I couldn't do more easily in Forth.

I designed the memory for the c18 to occupy the same area as the processor. This means small, fast and smart. c18 can respond to a bus request by fetching from its memory, accessing off-chip or performing a calculation. The 25x avoids the von Neumann bottleneck by making up to 27 memory accesses at the same time (2 off-chip). And its multiple buses do not substitute a network bottleneck for a memory one.

Standard code will be in the ROM of each computer. How this is customized in RAM and the computers assigned tasks is left to the ingenuity of the programmer, not a compiler. Automatically generated or factored code has never impressed me. Nor has automatic place and route for circuit boards or silicon. They are both an order of magnitude from human performance. Because humans understand the problem, judge the results and cheat as required.

Marginalizing of the blind
by Medievalist

When I built my first Internet node, the web did not yet exist, and one of the amazing things about the Internet was how friendly it was to the blind.

Now, with some computer experts estimating that over 50% of the Internet is incomprehensible to braille interfaces, and most computer operating systems devolving to caveman interfaces ("point at the pretty pictures and grunt") we seem to be ready to take the next step - disenfranchising the merely color-blind.

I realize that colorforth is not inherently discriminatory, in that there are a great many other languages that can be used to do the same work. The web is also not inherently discriminatory, because it does not force site designers to design pages as stupidly as, for example, Hewlett-Packard.

Would you care to comment on the situation, speaking as a tool designer? How would you feel if a talented programmer were unable to get a job due to a requirement for colored sight?

CM: I'm amazed at how effective blind programmers can be. I rely so strongly upon seeing the code that it's hard to imagine listening to it. Yet I know it can be done. Not being color-blind, it's hard to appreciate the degree of information loss. But it's less than being blind.

My goal is to develop tools that augment my abilities. If others can use them, fine. It would be foolish to lose an opportunity to explore or excel just to conform to some equalitarian philosophy. Too often our culture seeks the lowest common denominator.

20-20 vision is required for fighter pilots. I have no qualms about requiring color vision for programmers. Everyone does not need to be a programmer.

But in fact, color is merely a property of words that helps to distinguish them. As is intensity, size, font, volume and tone. I'm sure colorForth will be translated into these other representations. I, myself, will be exploring spoken colorForth. (As soon as I can decipher PC sound cards.)

Massively Parallel Computing
by PureFiction

The 25X system reminded me of IBM's Blue Gene computer, where a large number of inexpensive CPU cores are placed on a single chip.

The biggest problem in dealing with a large number of small cores lies in the programming. I.e. how do you design and code a program that can utilize a thousand cores efficiently for some kind of operation? This goes beyond multi-threading into an entirely different kind of program organization and execution.

Do you see Forth (or future extensions to Forth) as a solution to this kind of problem? Does 25X dream of scaling to the magnitude that IBM envisions for Blue Gene? Do you think massively parallel computing with inexpensive, expendable cores clustered on cheap dies will hit the desktop or power-user market, or forever be constrained to research?

CM: Forth is a massively pragmatic language: do whatever you can to solve a problem. Its strength is in the ease of violating whatever rules it has. The 25x is similarly pragmatic. I don't know how to program it yet, but I'm confident I can. It's just another level of factoring.

The parallelism provided by the 25x has a different slant from other parallel architectures. The computers are not identical. I expect many will have different ROM and different interface to the real world. This asymmetry is a powerful clue as to how applications will be factored.

A 10x10 array of 25x chips is an easy board to build. At 50 Watts, it needs as much power as a notebook. That's 2500 computers providing 6M Mips. I can't imagine programming them any other way than Forth.

The advantage of Forth in this kind of context is that it scales. Forth is the machine language, Forth is the high-level language, Forth is the task-control language, Forth is the supervisory language. Each of these has a different vocabulary, but they share syntax, compiler and programmer skills.
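
A hedged sketch of that layering (the word names and port address are invented for illustration, not taken from the 25x): the machine-level, application-level and task-control definitions use exactly the same syntax, just different vocabularies.

    \ "Machine level": words close to the hardware (hypothetical port address).
    HEX 40 CONSTANT LED-PORT  DECIMAL
    : LED-ON   ( -- )  1 LED-PORT ! ;
    : LED-OFF  ( -- )  0 LED-PORT ! ;
    \ "Application level": built from the words below it, same syntax.
    : BLINK  ( -- )  LED-ON 100 MS LED-OFF 100 MS ;
    \ "Task-control level": still the same syntax, one more layer of factoring.
    : HEARTBEAT  ( n -- )  0 DO BLINK LOOP ;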

Back to the array of 25x chips. Each chip could be on a vertical and horizontal serial bus with 10 others. A half-duplex bus requires a computer to manage, so that accounts for 200 computers. Now whatever the application, data must be provided. Say 1GHz Ethernet. Data (and program) is received, distributed and crunched. The assignment and coding of computers follows the data flow. Results are routed back to Ethernet, or displayed or whatever. It's a nice programming problem, well within the ability of a human to organize.

Will this ever reach the mass market? I don't know.

The direction of 25x Microcomputer...
by Midnight Ryder

The 25x concept looks like it could really be a damned interesting idea. But one of the questions in my mind is: where do you want to head with it? Is this something that is to be used for very specialized research and scientific applications, or is this something that you envision for a general 'desktop' computer for normal people eventually?

Secondly, if you are considering the 25x for a desktop machine that would be accessible by people who aren't full-time geeks, what about software? Forth is a lost development art for many people (it's probably been 10 years since I even looked at any Forth code) and porting current C and C++ applications would be impossible - or would it? Is there a potential way to minimize the 'pain' of completely re-writing a C++ app to colorForth for the 25x machines, which could help to speed adoption of a platform?

CM: At this stage the 25x is a solution looking for a problem. It's an infinite supply of free Mips. There's no obligation to use them all, or even very many. But they can effectively be used to eliminate hardware. To bit-bang what would otherwise need a controller. So if you want video or audio or radio or ...

The first applications will doubtless be embedded. These offer greater volume, less software and less market resistance than a general-purpose computer. I see 25x reaching the desktop as dedicated appliances rather than universal golems.

I'm not interested in recoding C applications. My experience indicates that most applications are hardware-dependent. The 25x is as large a change in the hardware environment as I can imagine. This changes the program so much it might as well be rethought and recoded. The most efficient way to do that is Forth.

Forth is a simple, interactive language. Its learning curve is steep with a long tail. You can be productive in a day/week. This depends only on how long it takes to memorize pre-existing words. Good documentation and management helps mightily. I'd rather train programmers than fight code translators.

That said, there are those who look at the mountain of existing applications and want to mine it. C to Forth translators exist and with some pre/post editing could produce code for the c18 core. How to distribute the application among 25 tiny computers would be a good thesis.

Quick question
by jd

I have often conjectured that multi-threaded processors (i.e., processors that can store multiple sets of internal state and switch between them) could be useful, as the bottleneck moves from the processor core to communications and dragging stuff out of memory.

(If you could microcode the "instruction set", all the better. A parallel processor array can become an entire Object Oriented program, with each instance stored as a "thread" on a given processor. You could then run a program without ever touching main memory at all.)

I'm sure there are neater solutions, though, to the problems of how to make a parallel array useful, have it communicate efficiently, and yet not die from boredom with a hundred wait-states until RAM catches up.

What approach did you take, to solve these problems, and how do you see that approach changing as your parallel system & Forth language evolve?

CM: The 25x could implement a multi-thread application nicely indeed. Except that most applications expect more memory than a c18 core has. Whereupon memory remains the bottleneck.

It's important to choose problems and solutions that avoid using off-chip memory. Even so, with 25 computers to support, I expect that every memory cycle will be utilized. The computer controlling memory can be smart about priorities and about anticipating requirements. For example, it could guarantee enough access to support display computers.

And the nice thing about memory-mapped communication is that a computer need not be aware of its environment. It's an ordinary Forth program accessing data asynchronously. Delays are invisible, as is synchronization. Of course, due care is required to avoid lock-up loops.
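
A minimal sketch of such an "ordinary Forth program accessing data asynchronously", assuming two hypothetical memory-mapped cells shared with a neighboring computer (STATUS and DATA are invented names, not part of the 25x design):

    HEX 8000 CONSTANT STATUS   8004 CONSTANT DATA   DECIMAL
    : WAIT-DATA  ( -- n )
       BEGIN STATUS @ UNTIL   \ spin until the neighbor posts a value
       DATA @                 \ read it; the delay is invisible to this code
       0 STATUS ! ;           \ acknowledge, so we never re-read stale data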

These conjectures are fun. But in a year we'll have real applications to review. And a much better appreciation of the advantages and drawbacks of so many tiny computers.

Programming languages...
by Midnight Ryder

This one would probably require a bit more time to answer than you have available, but a quick rundown would be cool: Where do you see programming languages headed vs. where do you think they SHOULD be headed?

Java, C#, and some of the other 'newer' languages seem to be a far cry from Forth, but are languages headed (in your opinion) in the proper direction?

CM: I've been bemused with the preoccupation of new languages with text processing. I've been accused of not providing string operators in both Forth and colorForth. Indeed, I haven't because I don't use them. Editing a file to pass on to another program never struck me as productive. That's one reason I chose pre-parsed source, to break the dependence upon strings and text processors.

Languages are evolving, as evidenced by the new ones that arise. But as with natural evolution, the process is not directed. There is no goal to approach nor any reward for approaching it. But whatever progress you might perceive, I don't. New languages seem only to propose new syntax for tired semantics.

These languages are all infix. Which is extraordinarily clumsy for anything but arithmetic expressions. And even those are comfortable only because we learned them in Algebra 101. Do you remember the learning curve?
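
For example (a trivial sketch): the infix expression (3 + 4) * 5 needs parentheses and precedence rules, while the postfix form reads strictly left to right with neither.

    \ Infix:   (3 + 4) * 5
    \ Postfix: operands first, operators after; no parentheses, no precedence.
    3 4 + 5 * .    \ prints 35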

Does everyone really think that 50 years into the computer age we have hit upon the ultimate language? As more and more C code accumulates, will it ever be replaced? Are we doomed to stumble over increasingly bloated code forever? Are we expecting computers to program themselves and thus save civilization?

I'm locked in the Forth paradigm. I see it as the ideal programming language. If it had a flaw, I'd correct it. colorForth uses pre-parsed source to speed and simplify compilation. This solves a non-problem, but it's neat and worth exploring. At least it proves I haven't gone to sleep.

What about memory protection?
by jcr

From the web pages, I don't see any mention of access control.

Can this processor be used in a multi-user, general-purpose mode?

CM: If you had a chip, you'd physically control access to it. It doesn't make sense for another person to share your chip. He can get his own. Certainly an individual c18 has too little memory to multi-task. And I doubt 25 computers could run 25 tasks.

But the 25 computers can certainly perform more than one task. They have to share resources: communication buses, off-chip memory and interfaces. Access is negotiated by the computer in charge of the resource. There is no hardware protection. Memory protection can be provided by the access computer. But I prefer software that is correct by design.
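
A hedged sketch of what "access negotiated by the computer in charge of the resource" could look like in software (the flag layout and word names are invented for illustration, not the actual 25x code):

    \ One request flag per client computer, scanned round-robin by the owner.
    25 CONSTANT #CLIENTS
    CREATE REQUESTS  #CLIENTS CELLS ALLOT   REQUESTS #CLIENTS CELLS ERASE
    : REQUEST    ( client -- )  CELLS REQUESTS + TRUE SWAP ! ;   \ raised by a client
    : SERVE      ( client -- )  CELLS REQUESTS + FALSE SWAP ! ;  \ placeholder: do the transfer, then clear the flag
    : ARBITRATE  ( -- )         \ run by the computer that owns the resource
       #CLIENTS 0 DO
          I CELLS REQUESTS + @ IF  I SERVE  THEN
       LOOP ;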

Communication with other computers, via internal or external buses, is subject to the usual problems of scheduling, routing and authentication. Internally, at least, my goal is to minimize delay rather than attempt protection. I anticipate spectacular crashes while software is developed. (Have you ever crashed 2500 computers?)

Where is forth going?
by JanneM

I learned forth early on in my programming career; it was very memory and CPU efficient, something that was important on early microcomputers. It was also a great deal of fun (though far less fun to try and understand what you wrote a week earlier...). Today, even small, cheap microcontrollers are able to run fairly sophisticated programs, and it is far easier to cross-compile stuff on a 'big' machine and just drop the compiled code onto the development board.

Forth has (in my eyes) always been about small and efficient. Today, though, embedded apps are more likely to be written in C than in forth, and the "OS as part of the language" thing isn't as compelling today as it was in the eighties. Where is forth being used today, and where do you see it going in the future?

CM: Forth is being used today as it always has been. In resource-constrained applications. I think they will always exist. I'm creating some with the tiny c18 computers in the 25x. I imagine molecular computers will be limited when they first appear.

Personally, I don't mind losing a mature market that can afford abundant resources. Such applications aren't as much fun. But Forth isn't restricted to small applications. Even with huge memories and fast processors, small, reliable programs have an advantage.

The major project cost has become software, to the dismay of managers everywhere. On-time, bug-free software is the grail. Forth doesn't guarantee it, but sure makes it easier. Will this ever be convincingly demonstrated? Will management ever value results over procedures?

The currently popular language is selected by uninformed users. The only thing in favor of such democratic choice is that it's better than any other. But why would anyone want to debug 1M lines of code instead of 10K?

What's the next Big Computational Hurdle?
by DG

Now that sub-$1k computers are running in the GHz range, it seems that all the computational tasks on a common desktop system are not processor-bound.

3D, rendered-on-the-fly games get well over 30 frames per second at insanely high resolutions and levels of detail. The most bloated and poorly-written office software scrolls through huge documents and recalculates massive spreadsheets in a snap. Compiling the Linux kernel can be done in less than 5 minutes. And so on.

It seems that the limiting speed of modern computers is off the processor, in IO. What, then, do you foresee coming down the pike that requires more processor power than we have today? What's the underlying problem you intend to solve with your work?

CM: Memory is cheap. I don't mind wasting memory as long as it's not full of code that has to be debugged.

Likewise, Mips are cheap. The trick is to find productive ways to waste them. A Pentium waiting for a keystroke isn't very clever.

So here's a huge pool of Mips. What can you do with them? Voice recognition comes instantly to mind. Image recognition close behind. The brain deploys substantial resources to these tasks, so I suspect a computer must.

IO is indeed a bottleneck, but not in principle. If you can't get data from the camera to the computer, combine them. Put the image recognition algorithms in the camera. Analyse, reduce, compress data at the source. Meanwhile, it helps to have multiple paths off-chip.

revolutionary
by rnd

What is the most revolutionary (i.e., it is scoffed at by those in control/power) idea in the software industry today? Explain how this idea will eventually win out and revolutionize software as we know it.

CM: Forth! But then I haven't been out looking for revolutionary ideas. I like the phrase Baldrson used above: compressing idiomatic sequences. If you do this recursively, you obtain an optimal representation. I see no way to get a more compact, clear, reliable statement of a problem/solution.

Forth clearly revolutionizes software as most know it. It could lead to efficient, reliable applications. But that won't happen. A mainstay of our economy is the employment of programmers. A winnowing by factor 100 is in no one's interest. Not the programmers, the companies, the government. To keep those programmers busy requires clumsy languages and bugs to chase.

I don't have to be glib or cynical. Those are facts of life. Society must cope with them. But I don't have to. Nor you. There are niches in which you can be creative, productive, inspired. Not everyone can be so lucky.

Forth as intermediate language
by Ed Avis

Many high-level languages compile into C code, which is then compiled with gcc or whatever. Do any use Forth instead? I understand Forth is a stack-based language: doesn't that present problems when compiling for CPUs that mostly work using registers?

CM: I remember my shock at learning that Fortran compiled into Assembler, which then had to be assembled. A language that can be translated into another is clearly unnecessary. Truly different languages cannot be translated: C into Lisp.

Forth would make a fine intermediate language. But why have an intermediate language? It introduces another layer of confusion and inefficiency between the programmer and her computer. Macros were invented to support compiling directly to machine code.

Stacks are a compiler-friendly construct. Every compiler has to use one to translate infix notation to postfix. If this stack solution has to be assigned to registers, it's an extra step. Forth uses stacks explicitly and avoids the whole subject.
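
For instance (the word name is invented for illustration), the infix expression (a + b) * (a - b) reduces to a short run of stack operations with no register-assignment step at all:

    \ (a + b) * (a - b), written directly as stack code.
    : DIFF-OF-SQUARES  ( a b -- [a+b]*[a-b] )
       2DUP +     \ a b a+b
       >R         \ a b           ( a+b parked on the return stack )
       -          \ a-b
       R> * ;     \ (a-b)*(a+b)
    7 3 DIFF-OF-SQUARES .    \ prints 40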

Register-based CPUs have more problems than just the complexity of their compilers. Their instructions must contain register addresses, which makes them longer and programs bigger. And it is rare that every register can be used in every instruction.

Moreover registers need to be optimized. After assigning system registers, performance depends on how well the remaining registers are handled. Compilers can optimize infix expressions better than humans. But such expressions are no longer the preferred means of doing arithmetic. DSPs and super-computers integrate difference equations.

Design guidelines encourage code with many subroutine calls each with only a few arguments. This is the style Forth employs. But it plays havoc with optimization, since register usage must be resolved before each call. So apart from being unnecessary and difficult, optimization has no effect on good code.


211 comments

Yeah! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299545)

I won!

I am (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299552)

so first its not even funny

this is a fp suck it down american pigdog (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2299565)

i rule, you suck - understand?

i hope so...

till your country gets flattened (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299624)

no one cares about shitty terrorists who aren't man enough to fight like real men

Re:till your country gets flattened (0)

Anonymous Coward | more than 12 years ago | (#2299980)

like smart bombing?

oh, the bravery of being out of range...

NEXT anyone (0)

eadint (156250) | more than 12 years ago | (#2299568)

Whatever happened to NeXT? I thought that was going to be the end-all. 100% object oriented, etc. etc. Why not just work with what's there? I'm tired of having to scramble and learn new languages. Hey, why don't we make Perl OOP -- I mean true OOP, not that bolted-on crap. Linux, G4, Perl/OOP, NeXT -- hey, what a deal. Or maybe Clawhammer if they can get their act together.

JON KATZ MUST GO (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299585)

Am I the first to post that here ?

Stephen King, author, dead at 54 (-1, Redundant)

Anonymous Coward | more than 12 years ago | (#2299596)


I just heard some sad news on talk radio - Horror/Sci Fi writer Stephen King was found dead in his Maine home this morning. There weren't any more details. I'm sure everyone in the Slashdot community will miss him - even if you didn't enjoy his work, there's no denying his contributions to popular culture. Truly an American icon.

Re:Stephen King, author, dead at 54 (-1, Offtopic)

bliss (21836) | more than 12 years ago | (#2299613)

" just heard some sad news on talk radio - Horror/Sci Fi writer Stephen King was found dead in his Maine home this morning. There weren't any more details. I'm sure everyone in the Slashdot community will miss him - even if you didn't enjoy his work, there's no denying his contributions to popular culture. Truly an American icon."

any confirmation?

Re:Stephen King, author, dead at 54 (0, Offtopic)

dmorin (25609) | more than 12 years ago | (#2299625)

No, this guy posts this a few times a day. Has been doing it for weeks. I don't know why he thinks it's funny.

Re:Stephen King, author, dead at 54 (0, Offtopic)

Water Paradox (231902) | more than 12 years ago | (#2299861)

Probably similar to the reason I think it's very funny that YOUR post could accurately be applied to the content of many slashdot posters. :-)

Re:Stephen King, author, dead at 54 (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299632)

dear lord have you just been trolled!

Re:Stephen King, author, dead at 54 (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2299687)



Yes, an article can be found right here. [128.121.12.52] I'd like to say I'm shocked, but with all the other news going on "shocked" hardly seems appropriate.

Re:Stephen King, author, dead at 54 (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299699)

Yes, but you have had an ID-10-T error, so you will not be able to view the reference.

Re:Stephen King, author, dead at 54 (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299774)

Dear bliss,
Kudos on your most excellent meta-troll!
Truly you "get" what slashdot is all about.

However... (1, Offtopic)

jd (1658) | more than 12 years ago | (#2299958)

It was later discovered that he gets resurrected in the sequel, due out tomorrow.


DUH! This troll has been around for a while. Why can't the terrorists have blown them up, instead? At least it would have produced one positive outcome.


Please, if you read a public weblog, like Slashdot, K5, etc, you are going to get a LOT of bogus "reports", trolls, spam, hate-gunk, etc. When reading, the important thing is to look at the source, and even if the source is "trustable", see if there's anything confirming the claim elsewhere.


I'll use myself as an example. Before the karma cap, I had reached 720 points, with a typical increase of around 10 karma/week. At first glance, that's a fairly respectable total. On the other hand, when you realise I can easily reach a figure of around 40 posts in a week, a more accurate picture emerges. An average of one point of karma for every four posts, which works out to two points for every eight posts.


In other words, for all my "impressive" total, I was still only writing stuff that people found to be interesting, useful AND verifiable, only one time in eight. That leaves seven times in eight that my information is either of little worth or just plain wrong.


So, if I can be wrong 7 times in 8, so can anyone else. THAT is why you need to verify. If you can't find two genuinely independent sources who say the same thing, then you probably want to treat the information with suspicion. Ideally, you want three or four, before accepting even a single thing.


Of course, there are going to be real innovative thinkers, who see things that others can't. But you still should check. In this case, you might want to find out their reasoning and see if it makes sense. The old adage that "truth is beauty" is usually worth keeping in mind. Crick & Watson "discovered" DNA by simply building models until they had one that was beautiful. Actually figuring it all out never crossed their minds.


The same is true with any "innovative" thinking. Where you just can't check the reasoning, don't have the time, or knowledge necessary, simply look to see if it has an inner beauty. If it does, there's probably -something- to it. If it doesn't, it's most likely waaay out there.

Re:However... (0)

Anonymous Coward | more than 12 years ago | (#2300049)

what the hell are you talking about?

misplaced post I guess.

Re:However... (0)

Anonymous Coward | more than 12 years ago | (#2300290)

I dunno, the acclaim of the great body of your peers would brand you as understandably mediocre, not brilliantly insightful.

Albert Einstein was severely bitchslapped for years until the world finally caught up with him.

Apparently what you should look for is some poor schlub that occasionally gets a few positive moderations, but most of the time is stomped underfoot by the tyranny of the slashbots. Like me!

Re:Stephen King, author, dead at 54 (0, Offtopic)

kaxman (466911) | more than 12 years ago | (#2299670)

Well. What the hell am I supposed to do now??? He didn't finish his goddam dark tower series. I'm sad, but dammit... grrrrrr...

I really enjoyed many of his books, but he should have focused on his, imho, best series of books.

kaxman

Re:Stephen King, author, dead at 54 (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2299708)

u r fscking stoopid

Re:Stephen King, author, dead at 54 (1)

Rudeboy777 (214749) | more than 12 years ago | (#2300008)

As others have pointed out, this is a notorious troll. However, I'm a big Dark Tower fan too and didn't know about this until 2 weeks ago (maybe you haven't seen this either?) It's the prologue to his next Dark Tower book [stephenking.com]. Sounds like a really cool story too, although like the last one, it won't advance the main plot too much.

chuck moores involvment with terrorism (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2299608)

if you check on CNN right now they are reporting that chuck moore has strong ties to a sect of the islamic jihad that may have taken part in the bombing. moore was found with piloting manuals in arabic, several semi-automatic rifles, and a varity of extremist islamic propaganda when his house was searched. he has been brought in for questioning reguarding the recent events surrounding the terrorist destruction in downtown manhatten.

The Blind (3, Interesting)

Kallahar (227430) | more than 12 years ago | (#2299612)

"Now, with some computer experts estimating that over 50% of the Internet is incomprehensible to braille interfaces, and most computer operating systems devolving to caveman interfaces ("point at the pretty pictures and grunt") we seem to be ready to take the next step - disenfranchising the merely color-blind."

This brings me back to the BBS days. One of the best people in the area was a blind judge. He used a text->speech program which allowed him to do everything on the BBS that everyone else did. Since the BBS's were all text-only anyway the interface was easy. Nowadays we have so many sites that are design-centric that I can't see how people with disabilities get around.

I always strive to keep my sites simple and clean (like slashdot) so that the site can be more easily used by anyone, anywhere.

This isn't to say that flash, etc. shouldn't exist, but I don't think that they belong on a business-oriented site.

Travis

I imagine it this way (0)

bliss (21836) | more than 12 years ago | (#2299638)

Being blind really ruins most of your life. At least from what I am able to determine. Generally I can't see how you can do anything but stay on disability without some really magical help.

Re:The Blind (1)

reynaert (264437) | more than 12 years ago | (#2299784)

I always strive to keep my sites simple and clean (like slashdot) so that the site can be more easily used by anyone, anywhere.

I can tell you never looked at Slashdot in Lynx.

Incomprehensible content (5, Funny)

heatsink (17798) | more than 12 years ago | (#2299647)

"Now, with some computer experts estimating that over 50% of the Internet is incomprehensible to braille interfaces"

Isn't this because over 50% of the Internet is porno?

Re:Incomprehensible content (1)

kaxman (466911) | more than 12 years ago | (#2299659)

It has to be way more than that! It's everywhere, in every single part of the internet!

Re:Incomprehensible content (2)

Ghoser777 (113623) | more than 12 years ago | (#2299703)

That, and probably because half of them also don't look right in links. I can't imagine how difficult it must be for a blind person to try to navigate sites with multiple frames and poor info ordering. I mean, there are no big candy-coated indicators for braille users to look for when they're scanning a website via a braille interface. But web designers probably don't take this into account because their managers usually just want to see cool designs that they can see and that are easy to navigate for them (aka usually people who can see).

F-bacher

Re:Incomprehensible content (2)

jd (1658) | more than 12 years ago | (#2299748)

Probably right. In which case, it's incomprehensible to any device based on logic.

Re:Incomprehensible content (1, Funny)

Anonymous Coward | more than 12 years ago | (#2299801)

?

Why can't logical devices comprehend porn? The reason people look at porn is because it turns them on. We'll define this as "good." Then it's only logical to conclude that if a logical device understands what's good, then it can understand porn.

Kill All Muslims. Death to Islam. Kill All Arabs. (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2299653)

The dead scream out for justice:
  1. Kill all Arabs.
  2. Kill all Muslims.
  3. Kill all Mohammedans.
  4. Kill all Towel Heads.
  5. Kill all Camel Jockeys
  6. Kill all Islam.
  7. Nuke their countries to hell.
  8. Nuke them again.
  9. Death to Islam.

I piss on Mecca. I wipe my ass with the Koran. I spit upon Mohammed.

Re:Kill All Muslims. Death to Islam. Kill All Arab (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299683)

amen

I'm impressed (-1, Redundant)

Uttles (324447) | more than 12 years ago | (#2299667)

Excuse me for the lack of content in this post but:

This guy is really smart.

Re:I'm impressed (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2299700)


and his sister is a slut!

Forth is the Language of Robot AI (1)

Mentifex (187202) | more than 12 years ago | (#2299701)

The language that Chuck Moore created for the control of telescopes in astronomy became so valuable for use in personal robotics that Mind.Forth for robots has resulted at the http://sourceforge.net/projects/mind/ [sourceforge.net] website where more than three hundred Open Source AI projects are rushing to introduce real artificial intelligence. Mind.Forth has a companion version in MSIE JavaScript at http://mind.sourceforge.net/ [sourceforge.net] -- with Tutorial and hard-copy Troubleshoot print-out options.

WTC (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2299704)

Why don't americans fuck (arabs/palestinians/ragheads/dune coons/sand niggers)? (select according to taste)

They lose their erections...

Chuck is not going to be a useful visionary (3, Insightful)

watanabe (27967) | more than 12 years ago | (#2299707)

Chuck is an unabashed Forth zealot. I guess this is fine; I'm sure Forth is great. But, I get the feeling he is so focused on the details (Commenting that programs don't need to read text to do something useful? Hello? How was this slashdot interview pulled from a database and sent to my web browser?) that I feel like reading stuff from him will only be interesting if I want to know something about Forth. This, compared to Neal Stephenson, say.


I even get the impression that Chuck would be happy that I noticed this about him. It just makes me think that there will be no revolution coming from the Forth camp. Which, I hadn't really expected, anyway. But, I'll cross it off my list of possible revolution starters, anyway.

i can't believe you! (0)

Anonymous Coward | more than 12 years ago | (#2299897)

i can't beleive you douchebag morons who upvoted
this troll whore! oh wait, i forgot you moderators all worship perl, so its okay to say "(any.language.but.perl.here) sucks!"

WEAR RED WHITE AND BLUE ON MONDAY (OFFTOPIC) (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2299716)

Spread the message!

Compression (4, Insightful)

donglekey (124433) | more than 12 years ago | (#2299720)

One of the things that can never get enough power is compression. Right now the next generation of image and video compression looks like JPEG 2000 and Motion JPEG 2000. I have tested it and it seems like a miracle compression; it consistently works at least 4x better than JPEG compression. Apply that to video and you have something incredible but very VERY expensive. He said it was a solution looking for a problem, and there it is. I think that HDTV video disc standards are leaning toward MPEG-4, which would be a mistake in my opinion. Motion JPEG 2000 would be much more forward-thinking if someone could pull it off. I say go for it Chuck.

More immediately, using a 25x chip in a digital camera to compress large pictures to JPEG2k could save lots of space and, more importantly, quality. Seems like a perfect marriage to me!

Re:Compression (2)

cruelworld (21187) | more than 12 years ago | (#2299759)

MPEG-4 is not DivX!!

MPEG-4 is an incredibly complicated scene-graph compression scheme. Currently people are merely using it as a transport stream format with MPEG-1 video compression.

A real MPEG-4 encoder has to be content-aware, and right now that's beyond our abilities (and hardware).

Re:Compression (1)

donglekey (124433) | more than 12 years ago | (#2299804)

Can you give me some links to read? I guess I have made this generalization and I am sure others have as well.

What I like about Chuck Moore (5, Insightful)

Angst Badger (8636) | more than 12 years ago | (#2299735)

The thing I have always liked best about Chuck Moore is that, whether you agree with him or not on a particular point, his ideas are always interesting and original. He's not afraid to follow his own judgment wherever it leads, and while he may perhaps end up following more blind alleys than a conventional thinker, it's people like him who will also make the most breakthroughs. In this day of C++/Java/XML/insert-other-orthodoxy-here, it's good to have someone like Chuck Moore around to remind us that computing can still be exploratory and experimental, and that you can still make a living without following the herd.

talk about colorForth (1)

n8willis (54297) | more than 12 years ago | (#2299761)

I'll tell you one thing: he sure has Forth-colored glasses on....


I'm not bashing Forth. I think zeal can be a healthy thing. But, you know, zeal about a concept, not about one isolated tool/approach.

Nate

Re:talk about colorForth (0)

Anonymous Coward | more than 12 years ago | (#2299955)

forth _is_ a concept.

this is why you will never see real object oriented forth (rather than OO tacked onto forth); forth does not need OO, it _supersedes_ object orientation. it has all the advantages of OO without the excess complexity.

Hmmm (5, Insightful)

Reality Master 101 (179095) | more than 12 years ago | (#2299781)

Forth clearly revolutionizes software as most know it. It could lead to efficient, reliable applications. But that won't happen. A mainstay of our economy is the employment of programmers. A winnowing by factor 100 is in no one's interest. Not the programmers, the companies, the government. To keep those programmers busy requires clumsy languages and bugs to chase.

To be honest, to me this invalidates everything else he said. If you have to depend on a conspiracy to figure out why your pet language is not universally adopted, then you are not living in reality.

I used Forth a long time ago. In fact, I advocated using Forth for the game company I worked at because I liked its simplicity and compactness. But I realize now that the practical measure of a language is how easy it is to maintain it... and Forth is not that language.

It kind of reminds me of APL zealots (yes, there used to be those, and they're probably still around in hiding). They claimed much the same things... that APL should be the language that everyone uses (I remember someone trying to convince me that APL would be a great language for an accounting system). They would NEVER admit that APL was hard to maintain.

I think this guy needs to pull his head out of the clouds and realize that there just might be reasons other than conspiracy that Forth is not more widely used. Forth had its time in the sun, and it was eventually rejected.

Re:Hmmm (1)

spoggle (319631) | more than 12 years ago | (#2299814)

I agree with you to an extent, but Forth hasn't been rejected at all!

Forth is in broad use for very low-level embedded stuff - doesn't every Sun computer have Forth embedded in it? FreeBSD uses Forth in its boot process, and I'm certain that there are others...

Re:Hmmm (1)

Reality Master 101 (179095) | more than 12 years ago | (#2299883)

I agree with you to an extent, but Forth hasn't been rejected at all!

I should have said that it's been rejected as a "general purpose language". I think Forth has a lot of value in embedded applications.

Re:Hmmm (3, Funny)

Viadd (173388) | more than 12 years ago | (#2299847)


They would NEVER admit that APL was hard to maintain.

But you don't have to maintain an APL program. If, for instance, there was a bug in the operating system kernel, and it were written in APL, you would just re-write it. How hard is it to rewrite one line of code?

Re:Hmmm (0)

Anonymous Coward | more than 12 years ago | (#2299976)

"
To be honest, to me this invalidates everything else he said.
"

To be honest, to me you are a douchebag troll whore.

Re:Hmmm (4, Insightful)

William Tanksley (1752) | more than 12 years ago | (#2300046)

To be honest, to me this invalidates everything else he said. If you have to depend on a conspiracy to figure out why your pet language is not universally adopted, then you are not living in reality.

Read again. He did NOT claim any form of conspiracy; he simply identified common interest. There IS a common interest in all those sectors in staying employed.

At the same time, there's a more powerful common interest at stake: making money. It's better to make money than to stay employed (in a corporation which refuses to change to a more efficient concept, and which will therefore soon lose business).

So it's clear to me that Chuck's analysis is simplistic. But so is yours -- at least he's correctly identified a problem.

The real problem with Forth is that it achieves its successes by being fundamentally different. The theory behind Forth is totally at odds with the theory behind all other languages, with the exception of Postscript and Joy. And until the last couple of years, nobody had done ANY work on understanding that theory, and from that understanding coming up with simple ways to explain what makes a Forth program "good" or "bad".

Thankfully, we now have some work done on that, and I believe that the clear result is that all modern Forths are screwy. They encourage writing code which is bad, and therefore hard to maintain. This isn't the fault of Forth; it's the fault of the vocabulary Forth now has. The success of C wasn't due to the language (although it was good, it was only a minor improvement over previous languages); it was the teaming of the language with the library and runtime (AKA Unix).

Work is ongoing... See the concatenative languages group [yahoo.com] for more discussion and a very informative link.

As for APL... We have one APL "zealot" (your word) on the concatenative languages list. He's an excellent help; he can see right through many of the trivial differences in syntax to the problem actually being solved. There's no doubt that APL is a great language, and provides a marvelous "transform", converting a problem from one domain (its native domain) to another domain (the array programming domain) in which it can sometimes more easily be solved. It's like a laplace transform -- a wonderful tool for problem solving. One doesn't maintain a laplacian transform when the problem changes; one simply reworks the math.

Forth is different; in Forth you don't transform the problem into a form which fits the language; instead, you transform the language into a form that fits the solution domain.

-Billy

Re:Hmmm (1)

elmegil (12001) | more than 12 years ago | (#2300233)

He effectively said there was a conspiracy to perpetrate bugs, which there isn't. Beyond that, what the other poster said makes a lot of sense. I like Forth too, and have tried to do some personal project in it. But I find that when I inevitably get distracted, and return to the projects, trying to re-read the code and figure out where exactly I was is damn near impossible.

This is not to say that you can't write code in other languages that is just as incomprehensible, but I find that I can pick up a perl script I wrote 2 years ago and read it right away. I can't do that with Forth. It would take a lot of effort, and continuous programming (which I don't do in perl either) to remain fluent enough in Forth to do that. And since I only have these intermittent projects written in it....it's not worth the hassle.

Like it or not, Forth is not the most easily maintained language out there, and there are languages that do lend themselves more easily to long term maintainability.

APL Accounting System (1)

kgrgreer (448416) | more than 12 years ago | (#2300047)

I've seen a complete accounting system written in APL. I've also seen a complete banking system written in APL and deployed across tens of small financial companies and hundreds of sites. Both worked well.

Re:APL Accounting System (0)

Anonymous Coward | more than 12 years ago | (#2300286)

The question is not whether you can, it's whether you should.

Re:Hmmm (1)

dbremner (16330) | more than 12 years ago | (#2300097)

It kind of reminds me of APL zealots (yes, there used to be those, and they're probably still around in hiding). They claimed much the same things... that APL should be the language that everyone uses (I remember someone trying to convince me that APL would be a great language for an accounting system). They would NEVER admit that APL was hard to maintain.

Nowadays most of us APL zealots have moved on to J, K, or A+, all of which have most of the traditional procedural primitives: for, while, do, if, etc. There are many programming tasks where maintenance is not an issue but speed of development is, for example, rapid prototyping. Structured and commented APL is not hard to maintain if written by a competent programmer.

Re:Hmmm (2)

Animats (122034) | more than 12 years ago | (#2300135)

If you have to depend on a conspiracy to figure out why your pet language is not universally adopted, then you are not living in reality.

Good point. Forth has been around for a long time, and it's never really caught on. I've written a few thousand lines of it myself, for an embedded application. It's not that there's any big opposition to it, or that people can't understand it; it's that it just isn't all that great.

Big arrays of little machines aren't that useful either. That approach has been tried; the hardware can be built, but few problems fit that model well. The ILLIAC IV, the Hypercube, the BBN Butterfly, and the Connection Machine all fit that model, and they were all dead ends. The two extremes of that spectrum, shared memory multiprocessors and clusters of networked machines, are both useful. Nothing has been found in the middle that's equally useful.

The trouble with stack machines is that they have a worse Von Neumann bottleneck than register machines. Everything has to go through the top of the stack. It's probably possible to get around that with a superscalar architecture. (The FPU in x86 machines is a stack architecture, yet all modern implementations are able to get a few instructions going simultaneously.) But Moore would rather have lots of dumb processors than superscalar ones.

Re:Hmmm (2)

ansible (9585) | more than 12 years ago | (#2300163)

The trouble with stack machines is that they have a worse Von Neumann bottleneck than register machines. Everything has to go through the top of the stack.

Hmmmm.... I wonder about that. If you designed a CPU for Forth (or another stack-based language), you could have something where the first, say, 10 elements of the stack are in registers, and the rest in memory. As the stack grows, the bottom elements are moved off into main memory. Vice versa for shrinking. If done cleverly, there wouldn't be much need for copying data around.

Of course, I'm sure someone's already thought of this...

Re:Hmmm (1)

bentini (161979) | more than 12 years ago | (#2300274)

Oh? Really? You think you could do that?
Learn about CRISP by Ditzel et al. from Bell Labs. Damn near 20 years ago, amazingly efficient stack based RISC processor.

Re:Hmmm (2)

William Tanksley (1752) | more than 12 years ago | (#2300197)

The trouble with stack machines is that they have a worse Von Neumann bottleneck than register machines. Everything has to go through the top of the stack.

At the same time, though, because you don't have a register file, you don't have to have an instruction decode stage. As a result, your instruction cycle time goes down.

The cycle time needed for these machines is truly amazing. I built one back in college, and outperformed every other chip in the class.

-Billy

Intriguing (2)

jd (1658) | more than 12 years ago | (#2299787)

The replies are fascinating insights. It's a pity he didn't spend more time on each one, but he DOES have a "paying" job & real life, too. I guess that kind of limits people on the time they can spend on an interview.

This is NOT a criticism, though, more wishful thinking. What was said was fascinating, and there is a lot there to think over. Doubly so, since it IS just the tip of the iceberg.

Marginalizing the Blind (5, Interesting)

skroz (7870) | more than 12 years ago | (#2299791)

I have no qualms about requiring color vision for programmers. Everyone does not need to be a programmer.
Well thanks, Chuck. I'm not completely blind, but close enough that your colorForth is inaccessible, as is most web content. Thanks for telling me that I don't have to be a programmer. Guess what? I want to be, buddy. And while I don't think I'm the best coder that ever walked the earth, I think I can get the job done. But thanks to people with dismissive views such as yours, it's becoming harder and harder every day to do what I enjoy: coding. It's bad enough that most developers don't consider the needs of people with disabilities, but to hear someone who has considered then DISMISSED those needs is truly disheartening.

So right (2)

Ghoser777 (113623) | more than 12 years ago | (#2299829)

If he ever refused to hire somebody based on their inability to see color, I bet he'd lose in Court, lose bad too.

Chuck would have to prove that the ability to perceive colors is MANDATORY for coding, which it is not. It's understandable that people in wheelchairs don't run marathons, because a prerequisite to running is having legs. The only prerequisite for programming is a brain that contains knowledge of the language and some way to relay thoughts. There are braille keyboards for the second, and I'm assuming the previous poster has a brain.

If guys are successfully suing Hooters to be able to work there (actually I haven't heard about this in a while, does anyone have an update?), then blind programmers could definitely win this case.

F-bacher

Re:So right (0)

Anonymous Coward | more than 12 years ago | (#2300041)

I think you misunderstand. It is required to see color to work with a language that requires color. It's not required for you to see color for languages that do not.

If you disagree, don't use colorForth. Stop trying to make everyone else need a braille keyboard to get THEIR work done.

Re:So right (2)

Ghoser777 (113623) | more than 12 years ago | (#2300059)

Italics, bold, underline, superscript, subscript, etc. could all be used as substitutes for colors. This is not unreasonable.

F-bacher

Re:So right (2)

Genom (3868) | more than 12 years ago | (#2300053)

If he ever refused to hire somebody based on their inability to see color, I bet he'd lose in Court, lose bad too.

If it was for a generic programming job, I'd agree. For a generic job in *any* field, I'd agree.

If he was hiring for a specific job, which involved programming in a language where color was a key element, it could be a stated requirement that the applicant be able to see color.

In that case, I think he'd prevail -- simply because it's a stated requirement.

The same as if there was a manual assembly line job that involved separating piles of red and green items that were otherwise identical - I think that the job would, in part, require the applicant to be able to distinguish red from green.

In a generic setting, the ability to see color has no relevance to programming -- but in certain niche fields or jobs, it might.

As for the original poster - I'd be put off too, if I was in your situation. But it would only be a minor setback. Perl, C, and a myriad of other great languages don't require color vision (or any vision at all - although it might be hard to produce GUI apps without sight)

Re:So right (1)

The Panther! (448321) | more than 12 years ago | (#2300246)

If he ever refused to hire somebody based on their inability to see color, I bet he'd lose in Court, lose bad too.

I applied for a job at Motorola back in '93. They had two open positions, one in the wafer fab and another in the test department. The fab meant wearing bunny suits and dealing with etch chemicals; the test department meant endless boredom and $2/hr less. I wanted the money, but I could not get the job because I am severely color blind (11/13 color plates failed) and it requires acid green scale reading to determine chemical grades.

The moral of the story? Yes, you can "discriminate" between your employees based on their aptitudes, skills, and physical abilities if they come in direct conflict with their ability to perform the specified job. There's nothing illegal about it, and in most cases it is completely sensible.

I was still miffed about the pay differential, though, and my inability to do the job that paid more. Such is life.

Re:Marginalizing the Blind (2)

donglekey (124433) | more than 12 years ago | (#2299833)

Not everything can be made accessible to everyone, and that is his point. He isn't going to rethink his design because some people can't see well. He didn't mean that blind people shouldn't program, he meant blind people don't need to program Forth. He said that he knew lots of good blind programmers.

I do a lot of 3D animation. Why don't you go after Alias|Wavefront or Newtek for not making their products accessible to the blind? Your lack of sight is an obstacle that you will have to overcome, and it will limit you in some things. Just accept it. You can't use colorForth; it's not a big deal. You probably shouldn't become a photographer, animator, cinematographer or airline pilot. Deaf people won't become sound engineers. That's just the way it is, and people aren't going to limit themselves because someone somewhere might be offended. That's how we get politicians, not progress.

Re:Marginalizing the Blind (2)

CrosseyedPainless (27978) | more than 12 years ago | (#2300076)

Just accept this: he's gratuitously adding the requirement for color vision to programming. Would you be so accepting if someone added an unnecessary and mostly worthless requirement to your job description, one you physically can't acquire, and then said, "Oh well, not everyone needs to be able to do that job, anyway."?

No, the inability to use colorForth is not a big deal. In fact, given the total irrelevance of any Forth to the world, it's no deal at all. My objection is not to colorForth, but to your and Chuck's "fuck 'em" attitude. That's far more offensive than an unused, needlessly colorized computer language. But, I'll take your advice and just accept it.

--unofficial spokesman, Colorblind Computer Programmer's Association

Re:Marginalizing the Blind (1)

noc (97855) | more than 12 years ago | (#2300193)

I agree with you that it's his attitude and the offensive "well, not everyone needs to be a computer programmer" comment that's the problem.

His actions, on the other hand, I have no problem with. He's trying to come up with a better system for himself: encoding meaning in the color of words. He pointed out himself that this same meaning could be given to things like typeface, volume, etc. It would be a problem if he came up with inaccessibleForth, which gave him a slight productivity gain, then started advocating its use to the exclusion of people perfectly capable of computer science, just not his language. It would be possibly justifiable if he (and everyone using his system) got an order of magnitude productivity increase, IMO.

Re:Marginalizing the Blind (1)

Modab (153378) | more than 12 years ago | (#2300038)

Please observe what he said:
I'm sure colorForth will be translated into these other representations. I, myself, will be exploring spoken colorForth. (As soon as I can decipher PC sound cards.)



This man is simply saying that he wrote colorForth for his needs. He would not write code that he would not use. Why should he? It is refreshingly frank and honest. Someone else who is color blind can write code for themselves.
He is not against people with disabilities, but he is not your shepherd, and you are not a sheep.

Re:Marginalizing the Blind (2)

William Tanksley (1752) | more than 12 years ago | (#2300070)

You're taking him badly out of context. His colorForth is indeed inaccessible to you, but as he's repeatedly stated, color isn't the important thing -- he mentions tone, font, and emphasis as alternatives here.

He's only working with color right now, because that's what he needs; why should he do extra work to solve a problem nobody has? Hire him (or have him hire you), and he'll find a way for you to use ColorForth.

-Billy

Re:Marginalizing the Blind (1)

LazyDawg (519783) | more than 12 years ago | (#2300207)

He said shortly afterward that color is just his own method for distinguishing different types of code, and that you could use changes in font or typeface to make the same distinctions. But yeah, saying that blind people don't have to be programmers is mean :)

fanatics (1)

spoggle (319631) | more than 12 years ago | (#2299799)

I get very uncomfortable around fanatics like Chuck - I've been an engineer for a long time, and I usually find that anyone with only one tool is going to try to redefine all problems to be fixable with that tool. I've used Forth for years, but I also use C/C++, Perl, Lisp, etc. Each tool for a different problem. Forth has a well-defined and useful niche (its longevity proves that), but it's far from something that 99% of us will ever use.

That said, his processor is really cool looking - wish I had some to play with. I can think of a lot of problems that could be solved with these and Forth. But I think that the fanaticism here will put off many backers...

Re:fanatics (0)

Anonymous Coward | more than 12 years ago | (#2299813)

I agree, all fanatics must die.

But you know how the old adage goes : "If the only tool you have is a cruise missile, every problem looks like Nagasaki".

Re:fanatics (0)

Anonymous Coward | more than 12 years ago | (#2300023)

"
I get very uncomfortable around fanatics like Chuck
"

I get very uncomfortable around douchebag troll whores like you.

Re:fanatics (2)

William Tanksley (1752) | more than 12 years ago | (#2300098)

The most important part of Forth isn't a tool -- it's a concept. And most of you HAVE used it, although you deny it; most of you use the concept which powers Forth every time you print something. The same concept that powers Forth also powers Postscript.

-Billy
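(For what it's worth, here is a minimal illustration of that shared postfix idea; the word name fahrenheit is mine, not anything from Moore's code, and only standard Forth words are assumed:)

    \ Operands go on the stack first; the operation follows.
    : fahrenheit ( celsius -- fahrenheit )  9 5 */  32 + ;   \ */ is scaled multiply-then-divide
    100 fahrenheit .    \ prints 212

The PostScript version reads almost identically, with def, mul, div and add standing in for the Forth words -- same stack, same postfix order, which is the concept being referred to above.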

Color Vision and programming (2)

Jeremy Erwin (2054) | more than 12 years ago | (#2299812)

cm said " 20-20 vision is required for fighter pilots. I have no qualms about requiring color vision for programmers. Everyone does not need to be a programmer."

I would disagree. While the Free Software Foundation, for instance, does not explicitly condone "programming regardless of skill" (contributing to gcc does require some aptitude), free software allows anyone to program, without regard to financial means or the willingness to sign NDAs.

Before colorForth, color vision was not a prerequisite for programmers. Why should it be now? (Why are boldface, italic, and roman not appropriate analogues for red, green, and yellow, anyway?)

You don't _have_ to use ColorForth to code w/Forth (2)

Svartalf (2997) | more than 12 years ago | (#2299944)

There are tons of other Forth dialects that actually meet his criteria of being a true Forth, and they run under every OS out there, from DOS all the way to Linux.

ColorForth is his implementation of his idea of what Forth should be for him. If you can use it, fine. If not, find another Forth; I'm sure there will be other implementations that generate code for the 25x CPU at some point. People aren't using Stroustrup's implementation of C++ or K&R's implementation of C either, for that very reason.

Go hit Taygeta Scientific [taygeta.com]'s website for implementations of Forth that you can try out. For Linux users, I suggest bigFORTH from Bernd Paysan, BTW; it's a native-code-generating implementation with some GUI support that shows some promise for making usable apps, etc.

Re:Color Vision and programming (0)

Anonymous Coward | more than 12 years ago | (#2300029)

Yup, so he can't use color coding in his language because you can't see color? So what if it makes him more productive?

A 386 was required for running software! Why is it now? (says the owner of new software)

Well.. (2)

mindstrm (20013) | more than 12 years ago | (#2300087)

Boldface, Italic, and Roman *ARE* appropriate analogues, he already made reference to that.

Also.. I think what he's saying is, why should everything on earth cater to the lowest common denominator? It shouldn't.

You don't need 20/20 vision to fly a plane. If you want to work for the Air Force in particular, to fly their jets, you have to have perfect vision. Period.

So.. if you want to work with Color Forth, as he implemented it, you need to be able to see colors. I fail to see how this is bad.

Everything relating to computers does not need to be built for the lowest common denominator.

Re:Color Vision and programming (2)

William Tanksley (1752) | more than 12 years ago | (#2300146)

(Why are boldface, italic, and roman not appropriate analogues for red green and yellow, anyway?)

They are, of course -- he said they were. He's using color now because that's what he started with; you can't deny that it's easier to work with than multiple typefaces.

-Billy

Designing the software first, then the hardware (2)

On Lawn (1073) | more than 12 years ago | (#2299818)

This guy designing the language and the chip is interesting to me.

It's mentioned in the technical manual for the Enterprise 1701-D that the software was designed long before the hardware. This seems unnatural, considering how much easier it is to change software than hardware, but there are advantages.

EROS, for example, is an OS that is struggling to apply some really cool ideas because there is not enough hardware support for its permission paradigm. Conversely, it took MS over 10 years to make use of all the hardware features built into the 486 (not just the 32-bit bus).

3D libraries are now being implemented in hardware, after attempts to do it in software couldn't handle the massively parallel nature and speed requirements. Now there are crypto cards that similarly add hardware-implemented functions to prop up areas where software is slow.

So what I wonder is: is it really so infeasible or unreasonable to design the software first and then the hardware?

EROS (2, Interesting)

kip3f (1210) | more than 12 years ago | (#2299984)

I don't know where your comment about EROS [eros-os.org] comes from. EROS stands for Extremely Reliable Operating System, and has cool stuff like transparent persistence for all programs and a pure capability security system. EROS was built from the ground up to run on commodity Intel boxes. The OS is not ready for prime time because it is being re-written in C (from C++). It is GPL'ed, and it has mucho potential.

big fourth talk - where's the beef? (0)

Anonymous Coward | more than 12 years ago | (#2299822)

He repeatedly goes on about how fourth is so much more compact and efficient than c, pascal, and java.

Can he show a system with 1mm+ lines of fourth?

Don't give me the bs about rewriting a 1mm+ c program with 10k lines of fourth. How about taking a 25mm+ line c application and rewriting it in 1mm+ lines of fourth?

Can he show a system such as a 500 concurrent user database written enterly in fourth?

Maybe he should change his stock 'fourth is best programming language' answer to 'fourth is best embedded, small system language'.

I can already hear his answer 'why would I want to have a 500 concurrent user database system?'

Is the web server, os, etc which serves his home page written in fourth?

Re:big fourth talk - where's the beef? (0)

Anonymous Coward | more than 12 years ago | (#2299998)

"
He repeatedly goes on about how fourth is so much more compact and efficient than c, pascal, and java.
"

Do you want him to lie? And BTW, the spelling is "Forth".

The rest of your arguments are either incomprehensible or bullshit. You mar the name 'anonymous coward'. Douchebag troll whore.

You often don't need anywhere near as many lines.. (2)

Svartalf (2997) | more than 12 years ago | (#2300018)

1+ million line programs do NOT imply useful complexity; in fact, one should question an application that actually needs that many lines of code. Furthermore, many people lump together the lines of code of a whole system, artificially inflating the numbers, almost a "my system's bigger and better than yours" competition.

Also of note is that few people have ever attempted to make a "500 concurrent user database". Have YOU ever written one? If not, why are you wasting your time posting here when you should be out flogging your product in competition with the likes of IBM, Oracle, etc.? I'm pretty sure that if someone wanted to, they COULD make one in Forth (as there HAVE been relational databases written in Forth...).

And his answer would be right for that- he doesn't need one for the problems he's solving. Doesn't mean someone else couldn't do it- it just means someone hasn't come along using Forth to do the task in question.

And in answer to your webserver question, it's very likely that nobody coding in Forth has gotten around to doing a Webserver as most of the people using Forth are doing embedded systems that don't have network connectivity, etc. If there was one, he might be using it.

Just because nobody's using it for that task doesn't mean that your choice is better for the task or that the language in question is poor for it. Icon, a string and symbolic programming language from one of the inventors of Snobol, is an ideal (as in, much better than C or C++) language for writing compilers; nobody's using it because it's not well known, and everybody and his dog follows the crowd and uses what everyone else is using.

Trinary (3, Interesting)

Water Paradox (231902) | more than 12 years ago | (#2299838)

Looks like my question was too far out there to get moderated up. Perhaps it's a lame question. I believe it's a solid question, so I'll pose it again, maybe some of you have comments:

A trinary [trinary.cc] computer system is based on an architecture that is much more efficient than binary, especially for moving large numbers around. Since you are designing your own processors, have you considered the possibility of building (and coding on) a trinary system? It seems like trinary eclipses the revolutionariness of even colorForth, by taking us into a whole nuther dimension of architecture...

-WP
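(An aside on the efficiency claim: a number that takes k binary digits needs only about 0.63*k ternary digits, though each digit then needs three states instead of two; by the classic radix-economy measure, base 3 comes out roughly 5% ahead of base 2. Below is a minimal sketch in standard Forth that counts how many digits a value needs in a given base -- the word digits is mine, purely for illustration:)

    \ How many digits does u need when written in the given base?
    : digits ( u base -- n )
      1 >r                    \ digit count on the return stack; at least one digit
      begin 2dup < 0= while   \ keep going while u >= base
        tuck / swap           \ u := u / base
        r> 1+ >r              \ one more digit
      repeat
      2drop r> ;

    1000000 2 digits .   \ 20 bits
    1000000 3 digits .   \ 13 trits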

this guy is on crack (2, Insightful)

jasno (124830) | more than 12 years ago | (#2299979)

"There is no hardware protection. Memory protection can be provided by the access computer. But I prefer software that is correct by design."

That statement alone should point out that this guy has no clue about real world software design. People make mistakes, big ones, and they're not always caught in the debug cycle.

He sounds like a really smart guy who's written a lot of cool things ON HIS OWN. Once you break out of the individual 'hacker' environment and have to teach and share with others, a lot of this stuff falls apart...

Re:this guy is on crack (0)

Anonymous Coward | more than 12 years ago | (#2300044)

"
That statement alone should point out that this guy has no clue about real world software design.
"

That statement alone should point out that you
have no clue about how not to be a douchebag
troll whore.

"
People make mistakes, big ones, and they're not always caught in the debug cycle.
"

And then they get fixed. I suggest you get 'fixed', douchebag troll whore.

Re:this guy is on crack (2)

tb3 (313150) | more than 12 years ago | (#2300068)

I agree. His comment sounds similar to a comment I heard an OS/2 zealot make years ago. He was talking about the single-threaded nature of Presentation Manager and said that if everyone wrote perfect programs PM would never lock-up. I asked (rhetorically) how likely he thought that was. I'm still waiting for an answer.


Honestly, have you ever seen anything but the most trivial program that was "correct by design"?

Re:this guy is on crack (2)

William Tanksley (1752) | more than 12 years ago | (#2300299)

He's worked with a lot of teams as well. Memory protection is NOT a way to get bugs out of your system (that's STUPID; there's no protection inside a process); rather, it's a way to emulate an air gap between programs (in other words, for security). When you can have multiple processors that cheap, it's far more secure to just set up a separate processor.

-Billy


A man who doesn't care... (4, Insightful)

Midnight Ryder (116189) | more than 12 years ago | (#2300057)

After reading the results of the interview, I really like Chuck Moore. Why? Simple - he's got a language he likes and develops further for his needs when necessary, and when it comes to what everyone else thinks, he doesn't care!

That's not necessarily a bad thing, in some ways. How different would colorForth be if, for instance, he stopped to consider the effect on color blind or blind people trying to use the language? What if he stopped to concern himself deeply with how to get colorForth accepted as a mainstream language?

Instead, he concentrated on creating something he felt was the perfect language for him - not really for anyone else. There's something very admirable about that. It seems like projects these days (I mean Open Source projects in particular - commercial projects obviously tailor to as many people as possible) end up giving up part of their original focus to appeal to a much broader audience within their application grouping. He doesn't care how (x) implemented (y) - if it doesn't fit the applications he's been working on, then he ain't adding it in.

On the flip side, that means that colorForth, for instance, isn't going to get a whole lotta acceptance. His comment about blind programmers struck me as callous, but, what the hell - it pretty much comes down to being 'his personal language'. If that's how he treats it, then yeah: to program in colorForth (Chuck's personal language) you have to adapt it yourself - font changes for color blind people, or perhaps tonal changes for those who have the source read to them by text-to-speech programs.

But the one comment that struck me as wrong was his suggestion that the reason more people don't use it is a matter of conspiracy. *SIGH* No, Chuck - if you build a language and tailor it pretty much completely for yourself, well... who the heck is gonna really care that much, since you don't?


Quibble about lambda and parallelism (2)

alispguru (72689) | more than 12 years ago | (#2300069)

The lambda calculus is not inherently sequential, and languages based on the lambda calculus can be massively parallel in execution, just by using simple tricks like evaluating arguments in parallel. The ideas have been floating around for decades - they boil down to:

Functional, side-effect-less programs and primitives

Parallel evaluation of arguments

Parallel evaluation of mapping primitives

See here [nec.com] for a recent reference.

Not everyone has to be a programmer (1)

justinstreufert (459931) | more than 12 years ago | (#2300117)

20-20 vision is required for fighter pilots. I have no qualms about requiring color vision for programmers. Everyone does not need to be a programmer

Yes, but the set of people who want to be programmers and the set of colorblind people are NOT mutually exclusive.

I am a colorblind programmer. I program in C, PHP, and Perl regularly. I have no qualms with users who, endowed with color vision, enhance their programming abilities with colorized editors. My coworkers, who all use syntax highlighting, come to my black-and-white display and squint at the screen, almost unable to read the code displayed.

However, the sentiment embodied here is truly ugly. I mean, does this guy have any idea how frustrated would-be fighter pilots feel when they are turned away because of imperfect vision? If you want to fly a plane to defend your country and ideals, being unable to do so because of an imperfection beyond your control would be devastating.

There are a myriad of alternatives. Even within Forth it would be possible to represent the structures currently shown as color with other types of highlighting. Moore, get a clue.

Justin


Forth is relevant? prove it! (1)

studboy (64792) | more than 12 years ago | (#2300209)

I admire Forth, it's a fascinating language. But would I consider it for a project, even an embedded one? No way!

"Languages are evolving, as evidenced by the new ones that arise. But as with natural evolution, the process is not directed." Mr. Moore notes. He missed the fact that evolution *is* directed -- anything that survives, flourishes. LISP, Forth, and C have a very long history, but the C-style syntax has completely dominated the development of new languages, including C# and Java. Why? Because it's easier read, easer to write, easier to develop with.

When I have C questions, I have gazillions of code samples to borrow and learn from. (cf: http://www.codecatalog.com/ ) There are multiple sites devoted to Perl. PHP. When developing with Forth I get a "not invented here" kind of thing: each of the few, small libraries available is tailored to a specific home-grown flavor of the language. Yes, I can write my own this or that, but *why*?

I'm interested in amateur robotics, for which Forth might be perfect. But what do I see? Nearly all robot code is written in 1) assembly, 2) BASIC, or 3) C. Assembly is of course specific to each chip, and totally nonportable. BASIC is readable, but only somewhat reusable: each flavor of BASIC is incompatible with the others. C immediately rises to the top -- even if I have to write all the libraries myself, the *language* doesn't change from underneath me.

Mr. Moore's inventions deserve attention for their audacity in completely upsetting the status quo. If his approach is superior, and if he is uninterested in whether the other 100% of the software world follows, fine. But where are all of Moore's beautiful chips? Applications? Where are people using Forth on other machines?

Show me!

- j

They took the fun out of Forth (2)

Waffle Iron (339739) | more than 12 years ago | (#2300227)

I recently downloaded GNU Forth out of curiosity. I hadn't played around with it in over a decade, but I always thought it was a lot of fun to program in Forth.

I soon noticed what looked like a cool new feature: named function parameters. You can now access stack slots by name instead of always juggling the parameters with operations like DUP, ROT, SWAP, etc.

After using this new capability, though, I realized that much of the fun of writing Forth code is figuring out clever ways to juggle the stack. Using named parameters makes the language kind of boring, combining C's explicit memory management headaches with the performance questions of an interpreted language.
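(For readers who haven't tried it, here is a minimal sketch of the contrast, assuming gforth's { ... } locals syntax and using word names of my own choosing:)

    \ Classic stack juggling:  ( a b -- a*a + b*b )
    : sumsq    dup *  swap dup *  + ;

    \ The same word with named locals (gforth extension; names bind in stack-comment order):
    : sumsq-l  { a b }  a a *  b b *  + ;

    3 4 sumsq .     \ 25
    3 4 sumsq-l .   \ 25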

I guess I just never got deep enough into the Forth ways to take much advantage of the magical "extend the language with itself" capabilities. OTOH, Lisp can do some of these tricks and provides automatic memory management.

Oh well, my favorite language this year is Ruby, anyway. It brings together a lot of good concepts from other languages in a nice way that's easy to understand; it even has a little bit of the extensibility that Forth exhibits.
