
Ask Slashdot: Why Are We Still Writing Text-Based Code?

timothy posted about a year ago | from the because-there-are-only-so-many-lego-in-the-world dept.

Programming 876

First time accepted submitter Rasberry Jello writes "I consider myself someone who 'gets code,' but I'm not a programmer. I enjoy thinking through algorithms and writing basic scripts, but I get bogged down in more complex code. Maybe I lack patience, but really, why are we still writing text-based code? Shouldn't there be a simpler, more robust way to translate an algorithm into something a computer can understand, one that's language-agnostic and free of cryptic jargon? It seems we're still only one layer of abstraction from assembly code. Why have graphical code generators, which could seemingly open coding to the masses, gone nowhere? At a minimum, wouldn't that eliminate time spent dealing with syntax errors? OK Slashdot, stop my incessant questions and tell me what I'm missing." Of interest on this topic: a thoughtful look at some of the ways visual programming is often talked about.


I think IBM is working on it (1)

SigNuZX728 (635311) | about a year ago | (#46191709)

With a project named "Watson". Or maybe not. I dunno.

Re:I think IBM is working on it (-1, Offtopic)

Anonymous Coward | about a year ago | (#46191851)

What does that have to do with BETA? Fuck BETA.

How about... (-1)

Anonymous Coward | about a year ago | (#46191867)


Re:How about... (-1)

Anonymous Coward | about a year ago | (#46192009)

And you know, fuck the EU.

Re:I think IBM is working on it (0)

Anonymous Coward | about a year ago | (#46192027)

Yeah... no...

Watson is a very long way from helping develop software, or even algorithms.

I know this one... (1, Offtopic)

djupedal (584558) | about a year ago | (#46191719)

It's because _we_ are still writing code. Self-written code that doesn't rely on wetware can't get here soon enough I think.

The more simple you make it the less complex it is (5, Insightful)

Anonymous Coward | about a year ago | (#46191731)

The reason programming languages are still as they are is simple: you can't produce something complex with something simple. The more you simplify something, the less control you have over it. Can a programming language be made that is not text-based? Sure, but I highly doubt you are going to get the flexibility to do a lot of things. Even assembly is still required sometimes.

Re:The more simple you make it the less complex it (5, Interesting)

garyebickford (222422) | about a year ago | (#46191947)

This view is belied by the graphical tools used to design and lay out hardware and chips. Higher-level languages in particular are largely based on connecting the data flow between various pre-defined blocks or objects: function libraries.

I actually built a primitive graphical Pascal pre-processor back in the late 1980s, which used the CMU SPICE circuit board layout program. Since the output of the program was text-based, it could be processed into Pascal code. The model I used was that a function was a 'black box' with input and output 'pins', but could also itself be designed in a separate file.

I never actually finished it, but it was pretty workable as a programming paradigm, and opened up some new ways of looking at programs. For instance, a 3-D structure could be used to visualize formal structure (function calls, etc.) on one axis and data flow on another.

Also, the Interface Builder for the NeXT machine was more-or-less graphical, IIRC only 2-D. It made for very fast prototyping of a new user interface, and the 'functional' code could be put in later. (I saw a former schoolteacher, who had never used a computer until a few months before, demonstrate creating a basic calculator in Interface Builder in under 15 minutes. It worked, first time.)

I think the real issue is in large part a chicken-and-egg problem. Since there are no libraries of 'components' that can be easily used, it's a lot of work to build everything yourself. And since there is no well-accepted tool, nobody builds the function libraries.

Looking at this from a higher level, a complex system diagram is a visualization that could be broken down to smaller components.

In practice, I believe that the present text-based programming paradigm artificially restricts programming to a much simpler logical structure compared to those commonly accepted and used by EEs. For example, I used to say "structured programming" is essentially restricting your flow chart to what can be drawn in two dimensions with no crossing lines. That's not strictly true, but it is close. Since the late 1970s, I've remarked that software is the only engineering discipline that still depends on prose designs.

It's been done (4, Insightful)

Misanthrope (49269) | about a year ago | (#46191737)

If you have to understand the concepts anyways, why is text worse than a graphical set up? You can't really avoid learning syntax this way if you want to write anything actually complicated.

Also, fuck beta.

COBOL !! (0)

Anonymous Coward | about a year ago | (#46191739)

Grace said to !!

Lego Mindstorms (4, Interesting)

mrbluze (1034940) | about a year ago | (#46191741)

Try Lego Mindstorms and see whether you find it quicker or slower. It's easy to make something simple but once the algorithm gets complicated it is not much easier to decipher than text code, and no faster in my experience. As soon as you want to get serious with the system, you will wish it had a low level system that lets you lay it out in text instead of images.

This is partly the reason why surviving languages use symbols representing sounds rather than images as the Egyptians used. It's faster to write, and possibly faster to read.

Re:Lego Mindstorms (0)

Anonymous Coward | about a year ago | (#46191873)

>surviving languages use symbols representing sounds
Over a billion people would have a few symbols with you...

>algorithm gets complicated
That just means you need symbols to represent larger blocks. Even electronics does this: nobody draws amplifiers out component-by-component anymore, they use a single symbol.

also, buck the feta.

Re:Lego Mindstorms (0)

Anonymous Coward | about a year ago | (#46191885)

Last I heard Chinese was still a surviving written language.

Re: Lego Mindstorms (0)

Anonymous Coward | about a year ago | (#46191969)

They use symbols to represent words.

Re:Lego Mindstorms (0)

Anonymous Coward | about a year ago | (#46191995)

Last I heard Chinese was still a surviving written language.

How many programming languages do you know that use Chinese script?

Re:Lego Mindstorms (2, Funny)

Anonymous Coward | about a year ago | (#46192037)

ChineseScript? None that I remember, but there is SumatraScript... or HawaiiScript? Maybe JavaScript? (I don't remember)

Re:Lego Mindstorms (1)

garyebickford (222422) | about a year ago | (#46191973)

This is partly the reason why surviving languages use symbols representing sounds rather than images as the Egyptians used. It's faster to write, and possibly faster to read.

Ummm ... Egyptian hieroglyphics were actually phonetic symbols. And Chinese (still in use) is pictographic, not phonetic.

No thanks (1)

Dan East (318230) | about a year ago | (#46191749)

I don't think I'd like the alternative to text-based coding. The only thing I can think of is some kind of fancy schmancy IDE where you drag and drop stuff and build some kind of visual flowchart. Those kinds of things are out there for people who don't like to type (or produce real software, for that matter).

Re: No thanks (0)

Anonymous Coward | about a year ago | (#46191883)

as a programmer, I know I am longing for the day when I can ditch my keyboard and reprogram something by removing a side panel and rearranging the underlying plastic cards inserted into slots... just like they always did on the Enterprise in Star Trek TNG. They always managed to get the expected result with no testing required!

Because people write text (5, Insightful)

Anonymous Coward | about a year ago | (#46191751)

This is a rhetorical question. It would be similar to asking "why do we write books or manuals when we can just record a video?"

The answer is that written words are how we communicate and record that communication as a civilization. Written communication is easy to modify and requires little space to store. And this is just scratching the surface, without even touching things like language grammar or syntax.

BASICally This (0)

Anonymous Coward | about a year ago | (#46191755)

05 REM Slashdot comment
10 CLS
15 PRINT "Back In the Bad Old Days you had to re-arrange cords and switches to re-program a computer."
20 PRINT "Then Came Text Based Code and everyone had a gay old time programming."
25 PRINT "Then 30 motherfucking years after the advent of the Macintosh you asked this question"
30 PRINT "They should have come up with this years ago."
35 GOTO 10

Power. (1, Insightful)

Anonymous Coward | about a year ago | (#46191757)

Words have power. They abstract complex ideas. One word and you have an image in your head. We don't think in terms of code, we think in terms of pictures. Text abstracts them.

Re:Power. (-1)

Anonymous Coward | about a year ago | (#46191909)

If, by "we", you mean "all of the retards like YOU", then ok.

You don't speak for the intelligent people in society, though.

Re:Power. (1)

Mitchell314 (1576581) | about a year ago | (#46191989)

This is slashdot, not society, so OP's point stands.

Re:Power. (2)

garyebickford (222422) | about a year ago | (#46192035)

IMHO that's just historic, mostly. EEs have been designing circuits with structural complexity at least as great as any software program, using graphical tools, all along. Early on (after the plugboard era) computers didn't have the horsepower or graphic capability to do software CAD, and so programmers got started using prose of necessity. It's quite possible that as a result, programmers have tended to be non-graphical people (viz. the folks that hate using X-windows, or all that "GUI crap".) Now getting a graphical language frontend to be as popular would require replicating a lot of existing work in the new domain - for instance a graphical function library front end that presents all of the C function library as 'chips'. And then you'd still have to either convince all those linear, text-based programmers to change, or start over with the new generation.

I believe that a new generation will in fact start over using a graphical system sometime in the near future. There are things you can see and understand in three dimensions that just don't show up in prose.


FUCK BETA, FUCK DICE (3529333) | about a year ago | (#46191761)

Please post this to new articles if it hasn't been posted yet. (Copy-paste the html from here [pastebin.com] so links don't get mangled!)

On February 5, 2014, Slashdot announced through a javascript popup that they are starting to "move in to" the new Slashdot Beta design. Slashdot Beta is a trend-following attempt to give Slashdot a fresh look, an approach that has led to less space for text and an abandonment of the traditional Slashdot look. Much worse than that, Slashdot Beta fundamentally breaks the classic Slashdot discussion and moderation system.

If you haven't seen Slashdot Beta already, open this [slashdot.org] in a new tab. After seeing that, click here [slashdot.org] to return to classic Slashdot.

We should boycott stories and only discuss the abomination that is Slashdot Beta until Dice abandons the project.
We should boycott slashdot entirely during the week of Feb 10 to Feb 17 as part of the wider slashcott [slashdot.org]

Moderators - only spend mod points on comments that discuss Beta
Commentors - only discuss Beta
http://slashdot.org/recent [slashdot.org] - Vote up the Fuck Beta stories

Keep this up for a few days and we may finally get the PHBs attention.

-----=====##### LINKS #####=====-----

Discussion of Beta: http://slashdot.org/firehose.pl?op=view&id=56395415 [slashdot.org]
Discussion of where to go if Beta goes live: http://slashdot.org/firehose.pl?op=view&type=submission&id=3321441 [slashdot.org]
Alternative Slashdot: http://altslashdot.org [altslashdot.org] (thanks Okian Warrior (537106) [slashdot.org] )


Garble Snarky (715674) | about a year ago | (#46191797)

What's the big deal? Who cares?

Ask Slashbeta: Why do you suck so badly? (-1)

Anonymous Coward | about a year ago | (#46191765)

Because I am Beta, that is the purpose of my existence. To ruin lives and bring unhappiness into this fragile world.

Re:Ask Slashbeta: Why do you suck so badly? (-1)

Anonymous Coward | about a year ago | (#46191827)

So you're this beta woman we are supposed to be fucking?

Labview (5, Insightful)

Anonymous Coward | about a year ago | (#46191767)

Because visual programming is even more awkward in almost every respect (see LabVIEW). It takes significantly longer to write, and large projects are all but impossible. There is a reason why circuits are not designed by drawing them anymore (in most cases, anyway).

Re:Labview (4, Interesting)

Garble Snarky (715674) | about a year ago | (#46191785)

I'm stuck on a several-month-long Labview project right now. It's been a terrible experience. I don't know if it's more because of the poorly designed editor, the language itself, or the visual language paradigm. But I'm sure all three of those are part of the problem.

Re:Labview (3, Informative)

ArchieBunker (132337) | about a year ago | (#46191957)

I use Labview all the time and it does exactly as advertised. I'm a hardware guy but occasionally need things done in software. Sure, it's not optimal, but it gets the job done.

How are circuits designed today if they are not drawn?

Re:Labview (3, Informative)

tftp (111690) | about a year ago | (#46192057)

How are circuits designed today if they are not drawn?

They are synthesized by XST, Synplify Pro, or a similar tool.

Slashcott Feb. 10-17!

Re:Labview (1)

Anonymous Coward | about a year ago | (#46192063)

I have used Labview since 1994. It most definitely does everything it claims and more. I also program in assembly (MPASM), C, and C++, but in Labview I can generate more proven code in far less time.

My only problem with it is that the Labview Development Environment is entirely proprietary. An open-source development environment would be nice, even if it didn't have the multitude of functions, as long as it had .vi and .llb compatibility.

Re:Labview (1)

bunratty (545641) | about a year ago | (#46192043)

I did a small project in Simulink (part of MATLAB), which uses graphical programming. It was also quite tedious. Writing text files seems so much easier to me. I wish all system administration were based on simply editing text files; if something wasn't working I could just look at the code and see what was wrong, rather than typing obscure queries to try to determine the current settings.

Re:Labview (1)

garyebickford (222422) | about a year ago | (#46192045)

There is a reason why circuits are not designed anymore by drawing circuits (in most cases anyway)

AFAIK they're certainly not designed in prose. What about all those VLSI and layout CAD systems?

Text-based books (5, Insightful)

femtobyte (710429) | about a year ago | (#46191771)

Why are we still writing text-based books, and communicating in word-based languages? Surely, we should have some modern, advanced form of interpretive dance that would make all such things obsolete. Wait, that's a terrible idea! Text turns out to be a precise, expressive mode of communication, based on deep human-brain linguistic and logical capabilities. While "a picture is worth a thousand words" for certain applications, clear expression of logical concepts (versus vague "artistic" expression of ambiguous ideas) is still best done in words/text.

April 1st isn't for a few more months (4, Informative)

t0qer (230538) | about a year ago | (#46191773)

I think the /. folks think it's an early April Fools Day joke. Not write code using text? That's like saying: write a book with pictures. Sure, it can be done, but it doesn't apply to all books.

Maybe beta is an early April Fools joke too.

No (0)

Anonymous Coward | about a year ago | (#46191777)

You'll like the Slashdot Beta then.

When I was taking a C class at a community college, the instructor gave up halfway through and just started putting buttons on windows in Visual C++, thus ensuring that the students wouldn't know what all the underlying code behind those shiny objects did.

Re:No (-1)

Anonymous Coward | about a year ago | (#46191963)

Slashdot has gone to complete shit.

It attracts no one but retarded idiots like the submitter.

No one with a functional brain would think a graphical computing language would be superior.

At least this is just becoming one less place for me to point my browser. With any luck, I'll be done with the retards on the Internet and back to creating real things outside again soon.

Fuck you all.

Church of Pain (4, Funny)

Moblaster (521614) | about a year ago | (#46191783)

Well, Grasshopper, or Unschooled Acolyte, or whatever your title of choice may be...

You did not hear this from me.

But most developers belong to the Church of Pain and we pride ourselves on our arcane talents, strange cryptic mumblings and most of all, the rewards due the High Priesthood to which we strive to belong.

Let me put it bluntly. Some of this very complicated logic is complicated because it's very complicated. And pretty little tools would do both the complexity and us injustice, as high priests or priests-in-training of these magical codes.

One day we will embrace simple graphical tools. But only when we grow bored and decide to move on to higher pursuits of symbolic reasoning; then and not a moment before will we leave you to play in the heretofore unimaginable sandbox of graphical programming tools. Or maybe we'll just design some special programs that can program on our behalf instead, and you can blurt out a few human-friendly (shiver) incantations, and watch them interpret and build your most likely imprecise instructions into most likely unworkable derivative codes. Or you can just take up LOGO like they told you to when you were but a school child in the... normal classes.

Does that answer your impertinent question?

because higher abstractions are hard (0)

bob_super (3391281) | about a year ago | (#46191789)

Try to run this kind of code in your CLI:
compile linux -ideal_desktop $my_needs
display slashdot_UI -ad_revenue -comment_friendly -no_backlash

What result did you get?
You can find pretty high abstractions for very limited subsets (Matlab algorithms, video filter IP in an FPGA), but it's near-impossible to get a generic compiler to guess what you really want if your code is too abstract.

Ongoing projects (-1)

Anonymous Coward | about a year ago | (#46191793)

i hear onda technology [ondatechnology.org] is also working on it
can't find a reference on their website though, but a few recent screenshots have already been around the tubes

Sure thing (5, Funny)

Tough Love (215404) | about a year ago | (#46191795)

Sure, and similarly, laws should not be written down in legal language, they should be distributed in comic book form.


Because text is the only medium that's varied enou (3, Insightful)

machineghost (622031) | about a year ago | (#46191803)

There have been LOTS of attempts at "visual code", and they all look great when you watch the 10 minute presentation on them, but when you actually try to use them you find that they all solve a very small set of problems. Programmers in the real world need to solve a wide variety of problems, and the only medium (so far) that can handle that is text code.

It's like saying "why don't we write essays in pictograms?" You might be able to give someone directions to your house using only pictograms (and street names), but if you want to discuss why Mark Twain is brilliant, pictograms just don't cut it: you need the English (or some other) language.

Ugh... (1)

Anonymous Coward | about a year ago | (#46191809)

why are we still writing text based code?

Because that's the most efficient way of doing it.

Shouldn't there be a simpler, more robust way to translate an algorithm into something a computer can understand?

No. Because the problem isn't with how you communicate with the computer, it is in changing your way of thinking to be specific enough that the computer has all the information needed and knows exactly what to do with it. Computers don't think and they have no concept of anything but numbers. One way or another, you're going to have to word things how the computer likes it.

Why have graphical code generators that could seemingly open coding to the masses gone nowhere?

Because 1) they are slow as fuck, 2) they aren't open-ended enough to let you build whatever you want, and 3) you still need to design around what the computer wants to accept. Over-engineering something leads to loss of control and robustness.

At a minimum wouldn't that eliminate time dealing with syntax errors?

Syntax errors are a non-issue. You could very well make the same type of mistake in any kind of code editor, but that aside, it takes, what, two seconds to fix a syntax error? And the compiler is going to tell you exactly what's wrong.
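For what it's worth, that claim is easy to demonstrate. A minimal Python sketch (the snippet and its one-character bug are hypothetical) showing how precisely the interpreter pinpoints a syntax error:

```python
# A one-character mistake: '=' (assignment) where '==' (comparison) belongs.
bad_source = "if x = 1:\n    pass"

try:
    compile(bad_source, "<example>", "exec")
except SyntaxError as err:
    # The interpreter reports the line number and a description of the
    # problem, which is why syntax errors usually take seconds to fix.
    print(f"line {err.lineno}: {err.msg}")
```

The same goes for compiled languages: the compiler stops at the offending line and says why.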

One practical example (3, Interesting)

TheloniousToady (3343045) | about a year ago | (#46191811)

One practical example that I know of is Simulink, which can be used to generate code from diagrams. I did some testing years ago on Simulink-generated source code, and the code itself was awful looking but always worked correctly. Not a lot of fun to test when you had to dig into it, though. Also, testing seemed superfluous after never finding any bugs in it. All the bugs we ever found were in the original Simulink diagrams that the humans had drawn.

if you "get coding" so well, why arent you coding? (4, Insightful)

dagrichards (1281436) | about a year ago | (#46191813)

You may believe that you 'get code', but clearly you do not. There have been more than a few attempts to make common objects flexible enough that even you could stack them on top of each other to build applications. They are unwieldy and create poorly performing applications.

Re:if you "get coding" so well, why arent you codi (1)

khchung (462899) | about a year ago | (#46191977)

You may believe that you 'get code'. But clearly you do not.

This was also my first thought when reading the summary. His question already proved that he doesn't "get code" at all.

What do you mean by text? (3, Funny)

Anonymous Coward | about a year ago | (#46191815)

Does APL [wikipedia.org] suffice?

Why don't all cars get 100mpg? (2)

Sam36 (1065410) | about a year ago | (#46191823)

I mean really, it is like the 21st century. Why can't cars get better gas mileage? I am not an engineer, or very good at physics, but it seems to me that we can do better than this.

Dumbass submitter... (-1)

Anonymous Coward | about a year ago | (#46191825)

Why are we still communicating in text? Why don't we just draw out our conversations or use hieroglyphics?

LANGUAGE is captured and expressed best, aside from audibly, in the WRITTEN WORD.

If you're such a dumbass that you need a picture book to understand a concept, then your ideas are not worth sharing with the rest of the world.

A painter can create an image, but a writer can create a universe.

A painter can show you his imagination, but a writer can guide yours.

Basically, you're an idiot for even asking such an asinine question.

Please never try to communicate with anyone ever again.

Because it is classic (4, Funny)

transporter_ii (986545) | about a year ago | (#46191839)

And why should you change if what you had worked great? I'm not against change, just as long as it is change for the better. If they came out with some snazzy new way to write code, but everyone said it sucked, while the old way worked just fine, then stick with the old way. Unless you just don't care about actually making writing code better. Now who in their right mind would want to change something just to make it worse?

ssis (0)

Anonymous Coward | about a year ago | (#46191847)

Try SSIS you idiot and see how far it gets you with its visual language.

Already done for many yrs - MBD (1)

postmortem (906676) | about a year ago | (#46191849)

Model-based development.
You design the model, Simulink generates the code.
http://www.mathworks.com/produ... [mathworks.com]

Not happening. (0)

Anonymous Coward | about a year ago | (#46191853)

Translating an algorithm into an abstraction isn't the problem. Functions do that and objects do that, with different levels of fit for different tasks. I don't need to reimplement a linked list, I can just use the linked list object my library provides most of the time. If I do have to implement an algorithm, it should be abstracted into helper functions or objects.
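As a sketch of that point: in Python, for example, a linked-list-style container already ships in the standard library, so there is rarely any reason to hand-roll one:

```python
from collections import deque

# deque is a ready-made doubly-linked container: O(1) appends and pops
# at either end, no hand-rolled node structs or pointer juggling needed.
queue = deque()
queue.append("first")        # enqueue at the tail
queue.append("second")
queue.appendleft("urgent")   # jump the queue at the head

print(queue.popleft())  # → urgent
print(queue.popleft())  # → first
```

The abstraction lives in the library, not in a diagram.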

So a language-agnostic way of representing algorithms in a non-text form seems kind of a weird request. Instead, a non-text language seems more promising.

The problem there is that text is actually pretty good. I've tried things like Racket and they bug the hell out of me. I like the idea of Smalltalk images - being able to save a VM image of where I currently am in the development and debugging process. But it doesn't seem to work out. Text is actually an incredibly good way of storing programme logic. It's efficient, easily editable in a wide number of applications, easily portable between systems.

The closest I've seen to a good visual programming tool is something like Yahoo! Pipes. And that's pretty much only good at handling exceptionally trivial data transformations. A good high-level language with well-written libraries would let you turn out much more readable code to do the same thing.

You can version control text, you can diff text, you can use the Git/svn blame tool to see who fucked things up.

because (2)

niff (175639) | about a year ago | (#46191857)

Just try to write a quicksort routine using any non-text-based programming language.

Or try to describe a 10 line shell script using UML.

You'll find out that text-based code is actually quite efficient.
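To make the point concrete, here is quicksort in a handful of lines of Python (a textbook sketch, not production code); imagining the equivalent as boxes and arrows is left as an exercise:

```python
def quicksort(items):
    """Textbook quicksort: a few lines of text, versus a diagram
    full of recursion boxes and data-flow arrows."""
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 3, 8, 1, 9, 2]))  # → [1, 2, 3, 5, 8, 9]
```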

You must not actually "get coding" like you say. (2)

darkshot117 (1288328) | about a year ago | (#46191865)

You mention that you aren't a programmer, and it's obvious by the questions you ask. There's a reason why all of the attempts to create non-text based programming languages have failed and people revert back to text code. Because text code simply works best. If some day we can plug a computer into our brain to convert our thoughts into computer code, then I'd prefer to write code that way. But until then, the best way for us to get our thoughts and ideas into code is to write it out.

PureData (2)

jblues (1703158) | about a year ago | (#46191877)

There are some graphical programming languages; PureData is one. Quick summary: Pd enables musicians, visual artists, performers, researchers, and developers to create software graphically, without writing lines of code. Pd is used to process and generate sound, video, and 2D/3D graphics, and to interface with sensors, input devices, and MIDI. I used it recently on a project and really enjoyed it. (Used it to remix the Australian Chamber Orchestra in real time on iPads.) Other folks are doing fantastic things with it too; check out the Rj Voyager App. Although Pd is a Turing-complete language, I wouldn't want to write anything but specialized applications with it. Text-based OO is just so fluent and fast.

Try it, you'll hate it (0)

Anonymous Coward | about a year ago | (#46191887)

I inherited a large program written in LabView. It is a visual programming environment that is good for making virtual instruments. However, in my experience it was very hard to document, search, etc. Making even small modifications was impossible, and we abandoned changing it. Instead we wrapped the parts we couldn't replace with C code and called the C from a scripting language. I have also had experience with other very large code bases written in cryptic languages such as FORTRAN IV. There it was much easier to find where to make changes, and tools like grep, diff, etc., all apply. Maybe this is because I already know the text tools, but I think there is a reason those kinds of visual programming environments haven't really caught on.

There is programming without code. (2)

darkwing_bmf (178021) | about a year ago | (#46191889)

Programming is nothing more than telling a machine what to do. You can tell your car to start by turning a key. You can tell the light to switch off by flipping a switch. You can even etch your own circuit board without typing anything. However, using actual words to code is much easier than designing an entire system from chemical and mechanical processes. That's why we have programming languages. Despite popular perceptions, those languages significantly simplify complex tasks.

Relevant xkcd (0)

Anonymous Coward | about a year ago | (#46191891)

Panel #2

not so bad... (0)

Anonymous Coward | about a year ago | (#46191897)

I started coding using Pure Data and Max/MSP over a decade ago, and have built large and complex pieces of software with them. They are great for prototyping and for communicating ideas to those unfamiliar with text-based code. They also allow text-based code inside the flow chart via little built-in Python and Java widgets, and their native objects are written in C. However, they both hog resources, and their code is barely portable from one version to the next. This is why they essentially fail in the long run.

Dataflow languages? (1)

fatphil (181876) | about a year ago | (#46191899)

They're inherently graphicalisable, as long as all of the building blocks fit inside nice small rectangles.

Having said that - ``|'' and ``>'' are graphical. So probably all shell programming is in part visual.
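That pipe-style dataflow translates straightforwardly into text languages too; here is a rough Python equivalent of a `grep | sort | head` pipeline built from generators (the sample log lines are made up for illustration):

```python
# A rough text equivalent of the shell pipeline:
#   grep error log | sort | head -n 2
# Each stage consumes the previous one lazily, like a pipe.
lines = ["boot ok", "error: disk", "warn: fan", "error: net"]

def grep(pattern, stream):
    # Generator "filter" stage, analogous to the grep process.
    return (line for line in stream if pattern in line)

matched = grep("error", lines)      # grep stage
top_two = sorted(matched)[:2]       # sort | head -n 2 stage
print(top_two)  # → ['error: disk', 'error: net']
```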

Having said that, what the fuck am I on?!? I write perl, and perl looks like ASCII art; it certainly doesn't look like text.

Mod down story, story bollocks.

Fuck beta.

Room for future expansion (0)

j-stroy (640921) | about a year ago | (#46191903)

Leaving room for this kind of design in our architectures is a great idea... for now, though, it would appear as whitespace.

Because the alternatives are worse (4, Insightful)

umafuckit (2980809) | about a year ago | (#46191905)

There are "visual" (non-text) languages out there and they're not very nice. A major proprietary one is LabVIEW [ni.com] , which is mainly used for data acquisition and instrument control (hence the name). This is what the code might look like [unm.edu] . Developing small applets in LabVIEW is very fast, but things get horrible as the project gets larger. LabVIEW issues include:
  • Hard to comment
  • Very easy to write bad code (particularly for beginners)
  • Version control is awkward
  • Clunky to debug because programs are hard to follow.
  • Hard to modify existing code
  • Coding becomes an exercise in placing the mouse in just the right places and finding the right little block.
  • As a beginner you waste lots of time on trivialities instead of actually learning to code.
  • Hard to learn from a book or even from reading somebody else's code.
  • Documentation is crappy.

Graphical languages are still programming. Syntax errors don't go away, they just manifest themselves differently. I don't think graphical languages really solve any problems, they just create new ones. That's why they haven't caught on.

Re:Because the alternatives are worse (-1)

Anonymous Coward | about a year ago | (#46192039)

LabVIEW is complete shit. The worst part is that it's used by people who are too retarded to code in text-based languages, so you KNOW they're going to fuck everything up from the start.

Because you don't know the algorithm (1)

iMadeGhostzilla (1851560) | about a year ago | (#46191907)

In my experience I often start with what I think is the algorithm that will solve the problem, and then discover that there are nuances in the real world that make my original algorithm/idea/flow inadequate, requiring refinements, iteratively, until I get the right algorithm. This kind of nuanced tweaking requires very nuanced tools, and nothing is more suitable than text. Some pictures with arrows certainly wouldn't cut it.

On a related note, I checked to see what that beta thing was about and I instantly hated it. It has as much appeal as seeing a mobile version of a site on my HTML5-capable phone.

Because (1)

bhcompy (1877290) | about a year ago | (#46191911)

Because Dreamweaver and Frontpage were awful and no one dares revisit it

Wow, what a brilliant idea (2)

sunking2 (521698) | about a year ago | (#46191913)

If someone ever comes up with such a thing I have the perfect name for it. MatrixX or perhaps Matlab.

I don't understand what the beta is meant to fix? (-1)

Anonymous Coward | about a year ago | (#46191915)

I don't understand what the beta is meant to fix?

its hard on the eyes/hard to read.

the classic slashdot looks so much better.

What about the slashNOT beta? (0)

Life2Death (801594) | about a year ago | (#46191921)

Its so rad and DIGG like, but who cares? Not the community! FUCKERS

The problem is not... (3, Insightful)

MpVpRb (1423381) | about a year ago | (#46191923)

..text vs "something-else-that-isn't-text"

The problem is complexity

Programs are getting too complex for humans to understand

We need more powerful tools to manage the complexity

And no, I don't mean another java framework

Are you kidding? (1)

David Betz (2845597) | about a year ago | (#46191927)

Seriously. None of us wants to draw out expression trees for every single thing we do. Why not just get rid of ALL math notation while we're at it!? Why not kill all language and communicate in internalized thought-vectors? Seriously, are you kidding me? Is it April 1st somewhere? Going for the troll of the year award?

Not one layer, several layers. (0)

Anonymous Coward | about a year ago | (#46191931)

If you think that the current state of programming languages is only "one layer of abstraction from assembly code", then you need to pick up K&R and learn some C.

It is a symptom of the industry and human nature (3, Interesting)

maple_shaft (1046302) | about a year ago | (#46191935)

There have been a number of attempts at making coding easy enough that non-engineering types can conceive their requirements in software and then communicate them through a tool, usually a visual one, that turns this into functional software. This has come in many different forms over the years: PowerBuilder, FoxPro, Scratch, BPEL, etc...

The fundamental flaw is one of the software development industry itself, especially when it comes to line-of-business applications. Analysts writing requirements has always been an inefficient and flawed model, as most requirements documents are woefully incomplete and tend not to capture the true breadth of functionality that ends up existing in the resultant software. Analysts are business-oriented people: they will think about the features and functionality that are most valuable and tend to miss, or not waste time on, what are deemed low-value or low-risk items. Savvy technical folks have had to pick up the slack and fill in the gaps with non-functional requirements (architecture), or even understand the business better than the analysts themselves, for quality software to be realized.

I have seen this song and dance enough. True story: IBM sales reps take some executives to a hockey game, show them a good time, and tell them about an awesome product that will empower their (cheap) analysts to visualize their software needs so that you don't need as many (expensive) arrogant software engineers always telling you no and being a huge bummer by bringing up pesky "facts" like priorities and time. So management buys Process Server (snake oil doesn't do it justice) without consulting anybody remotely technical. Time passes, and analysts struggle to be effective with it because it forces them to consider details and fringe cases. Software engineers end up showing them how to use it, at which point it just becomes easier for the software engineer to do the work instead of holding hands and babying the analysts all day. Now your company is saddled with a subpar product that performs terribly, that developers hate using, that analysts couldn't figure out, and that saved the company no money.

The best way to find out . . . (1)

Mitchell314 (1576581) | about a year ago | (#46191937)

. . . is to attempt to make a large project in a visual language yourself.

Command line vs GUI (1)

asmkm22 (1902712) | about a year ago | (#46191939)

Same deal with command lines and GUIs. A GUI may seem easier to learn and use, and it is for a lot of things, but you still have to use the command line to get at the more powerful features, like piping. Here's a quick example for Windows users:

Easiest way to see your IP address?

A. Depending on your OS version, you might be able to just double-click the connection icon (if it's there) and then click to see the details. Or you might get taken to the Network and Sharing Center where you have to dig around for the device adapter.


B. You can hit Win+R, type "cmd" and hit enter. Type "ipconfig" and view everything there. If you need details, you can do "ipconfig /all". Or, you could just type:

ipconfig | find "IPv4"

Option B might seem more complicated, but it's faster, more consistent, and only takes a few times to get used to.

Alternatives? (1)

rossdee (243626) | about a year ago | (#46191943)

"Why are we still writing text based code?"

Have you ever tried programming in APL ?

(And then tried reading the code later?)

At a high level of abstraction it will work-UFKC ATEB (1)

mrpacmanjel (38218) | about a year ago | (#46191951)

This kind of thing is okay for simple tasks but can get complicated very quickly.
Maybe at a high level of abstraction this works, but the inner details have to be coded in text.

UFKC ATEB https://scontent-b-lhr.xx.fbcd... [fbcdn.net]

It's because you get bogged down (4, Informative)

Todd Knarr (15451) | about a year ago | (#46191961)

So-called "visual programming", which is what you're wanting, is great for relatively simple tasks where you're just stringing together pre-defined blocks of functionality. Where you're getting bogged down is exactly where visual programming breaks down: when you have to start precisely describing complex new functionality that didn't exist before and that interacts with other functionality in complex ways. It breaks down because of what it is: a way of simplifying things by reducing the vocabulary involved. It's fine as long as you stick to things within the vocabulary, but the moment you hit the vast array of things outside that vocabulary you hit a brick wall. It's like "simplifying" English by removing all verb tenses except simple past, present and future. It sounds great, until you ask yourself "OK, now how do I say that this action might take place in the future or it might not and I don't know which?". You can't, because in your simplification you've removed the very words you need. That may be appropriate for an elementary-school-level English class where the kids are still learning the basics, but it's not going to be sufficient for writing a doctoral thesis.

Look at RpgMaker (4, Informative)

elysiuan (762931) | about a year ago | (#46191965)

Kind of a weird example but RpgMaker is a tool that lets non-programmers create their own RPG games. While there is a 'text based code' (ruby) layer a non-programmer can simply ignore it and either use modules other people have written or confine their implementation to the built in functionality.

Now look at the complexity involved in the application itself to enable the non-programmer to create their game. Dialog boxes galore, hundreds of options, great gobs of text fields, select lists, radio buttons. It's just overflowing with UI. And making an RPG game, while certainly complex, is a domain-limited activity. You can't use RpgMaker to make an RDBMS, or a web framework, or an FPS game.

The explosion of UI complexity needed to solve the general case (enabling the non-programmer to write any sort of program visually) is stupendously high. With visual tools you'll always be limited by your UI and what it allows you to do. Also think of scale: we can manage text-based software projects up to very high volumes (it's not super easy, but it's doable and proven). Chromium has something like 7.25 million lines of code. I shudder to think how that would translate into some visual programming tool, and I'm not sure how well it would scale.

Graphics doesn't scale well (2)

saccade.com (771661) | about a year ago | (#46191967)

Graphical programming languages were a popular PhD topic 25-30 years ago. You can find them today in systems targeted at kids or non-technical users, but you won't find them anywhere near serious software development. Text is an incredibly dense and powerful medium for communicating with machines. The problem with graphics for programming is that they do not scale well. Consider a moderately complex problem solved in, say, several thousand lines of code. Expressed graphically, the same thing starts sprawling across dozens of pages (or bubbles, or nodes, or whatever). It gets ugly quick.

Several years ago, I did the side-by-side experiment of expressing the same non-trivial digital circuit (a four-digit stopwatch with a multiplexed display) both as a schematic diagram and as text in Verilog. The graphic (schematic) version was much more time consuming, and *much* harder to modify, than the text-based Verilog. It became very clear why digital circuit designers abandoned graphics and switched to text for complex designs.

vvvv (0)

Anonymous Coward | about a year ago | (#46191971)

Hi, this is a nice visual programming tool (just for Windows...): http://vvvv.org

Visual Programming Has Been a 20-Year Failed Experiment (2)

Calen Martin D. Legaspi (3529525) | about a year ago | (#46191979)

Visual code generators have existed for two decades; the most famous is Rational's product. I've never met a developer, or read an unbiased article, claiming that code generators have helped. Usually they say it just leads to ugly code and high overhead to keep the diagrams maintained. In natural language, why haven't photos and videos replaced words? Because words are still the best way to express precise and complex logic. It's up to the writer to express complex logic in a series of simple steps that a reader can understand, or to write in a convoluted way.

We're not 1 layer above assembly (1)

viperidaenz (2515578) | about a year ago | (#46191983)

We're two.
That's already enough.
There are plenty of '4GL' languages out there where you draw diagrams and 'write no code', but they're very limited in what you can do. You can only do what the language and tool designers have already thought of. As soon as you need to do anything more complex, you need to write code.

If you want to try it, try it (1)

Sycraft-fu (314770) | about a year ago | (#46191985)

LabVIEW is visual code. As another poster mentioned, you can try Mindstorms NXT, which uses a cut-down version of LabVIEW. It really isn't any easier though, at least for anything of real complexity.

Work on your attention issues (0)

Anonymous Coward | about a year ago | (#46191987)

I don't know what you mean by "I get code" if you can't write it. You say you lack patience. Well, that's only going to get worse if you're confined to a visual (rather than textual) interface. The structure of the program should be in your mind. A UI can't relieve you of that.

Don't take your lack of patience for granted. Work on it. Take your time to reflect what you're doing.

As Simple As Possible, No Simpler (4, Insightful)

Bob9113 (14996) | about a year ago | (#46191991)

Most of the unnecessary parts of code are there for clarity, to make the code less cryptic. Most of the cryptic stuff is cryptic because it has been condensed. Consider iterating with a counter:

for $i in ( 1..100 )

That's about as concise as it can possibly be, and still get the job done. Most languages get a little more verbose, to add specificity and clarity:

for ( int i = 1; i <= 100; i++ )

That specifies the type of the counter (int), that it should include i=100 as the final iteration, and it explicitly states that i should be increased by 1 each time through. That's just a tiny example, but that is how most code is: as simple as possible without becoming noise-like, but no simpler. Some languages, like Perl, even embrace becoming noise-like in their concision.
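To see the same trade-off in another language, here is a quick Python sketch: the concise form hides the counter bookkeeping inside range(), while the explicit form spells out the start value, bound, and increment the way the C-style loop does.

```python
# Concise: the counter bookkeeping lives inside range().
concise = list(range(1, 101))

# Explicit: start value, bound, and increment all spelled out,
# just like the C-style for loop.
explicit = []
i = 1
while i <= 100:
    explicit.append(i)
    i += 1

print(concise == explicit, len(concise))  # → True 100
```

Both forms say the same thing; the concise one trades explicitness for density, which is exactly the dial most languages let you turn.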

As for doing it with pictures instead of text, we try that every five or ten years. GUI IDEs, MDA [wikipedia.org] , Rational Rose [visual-paradigm.com] , UML [wikipedia.org] , etc (there's some overlap there, but you get the picture).

I suspect the core problem is that code is a perfect model of a machine that solves a problem. The model necessarily must be at least as complex as the solution it represents. That could be done in pictures or with text glyphs. Why are text glyphs more successful? I'm guessing it is because we are a verbal kind of animal. Our brains are better adapted to doing precise IO and storage of complex notions with text than with pictures. It's also faster to enter complex and precise notions with the 40 or 50 handy binary switches on a keyboard than with the fuzzy analog mouse. But at this point I'm just spitballing, so on to another topic:

Fuck beta. I am not the audience, I am one of the authors of this site. I am Slashdot. This is a debate community. I will leave if it becomes some bullshit IT News 'zine. And I don't think Dice has the chops to beat the existing competitors in that space.

Simple: Text is most expressive (3, Informative)

gweihir (88907) | about a year ago | (#46191993)

All other potential "interfaces" lack expressiveness. Just compare a commandline to a GUI. The GUI is very limited, can only do what the designers envisioned and that is it. The commandline allows you to do everything possible.

So, no, we are not "still" using text. We are using the best interface that exists and that is unlikely to change.

Are they PAYING people to post non-beta comments? (0)

Anonymous Coward | about a year ago | (#46192001)

I hate beta.

Efficiency (2)

kifter (3526797) | about a year ago | (#46192021)

Using a keyboard is the most efficient way for me to create programs. My fingers are the fastest appendages I own. I assimilate keyboard shortcuts into my daily patterns to avoid using the mouse (hand-based: slower); the visual aspects of software engineering that typically require mouse movement take more time to execute. If it were more efficient to write the programs I need with a visual, mouse-based, or gesture-based editor, I would. Hands-On-Board

The answer is simple... (1)

3seas (184403) | about a year ago | (#46192023)

.... the baseline is wrong... it needs to be corrected and then you can have better interfaces for coding, or auto coding http://abstractionphysics.net/... [abstractionphysics.net]

Known issue... (1)

frank_adrian314159 (469671) | about a year ago | (#46192025)

Text is denser than graphics. Graphics, depending on how it's laid out, can provide better or worse documentation than the code itself. Debuggers run on text, not pictures. Text can be edited with a variety of tools; any graphical notation will have its own editor, which will probably suck. Big diagrams are too tangled, while small diagrams provide too little context or require enough off-page connectors to make any intent opaque. There's no good way to connect to globals without (again) said off-page connectors. Need I go on?

People have been trying out and discarding graphical programming interfaces since the early 1970's (at least). Nobody keeps using them. Get the clue.

Plus, fuck beta.

Shouldn't there be a simpler, more robust way? (1)

Culture20 (968837) | about a year ago | (#46192029)


Why? Got something better? (1)

khellendros1984 (792761) | about a year ago | (#46192049)

If I need to write a large piece of software, there's a certain irreducible complexity that will be involved. There's a point where simplifying the program any further would require dropping features from the program.

In the end, you still need to specify behavior. I don't think it matters whether the program is represented as some kind of image, control blocks arranged visually, a flowchart, text, or anything else; the problems of writing complex software will be the same. You need to organize the program logically, maintain its internal state somehow, take input, and produce results.

Say that I write some 100-class project in Python (so, a fair-sized system, but not *too* huge). I could come up with a visual metaphor for everything in the program, maybe with ways to assign pictures to variables and functions that (somehow) give a language-free representation of the meaning of that variable/function. It's still going to have 100 classes in it, the same effective logic in its design, but now, I'd have no idea how to search for a specific function in it. The thought that I'm left with is that yes, it would be cool to have a magic pill that made programming easy for everyone to do, but I'll believe it when I see it. I've never seen a non-text-based way to produce a computer program that could represent an algorithm in a way that wouldn't look like a horrific visual jumble for a non-trivial algorithm.
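The searching point is concrete: text code is plain data, so ordinary tools can query it. A small sketch of finding every function in a source file with Python's ast module (the class and method names here are invented for illustration):

```python
import ast

source = """
class Inventory:
    def add_item(self, name):
        pass
    def remove_item(self, name):
        pass
"""

# Walk the parse tree and collect every function definition;
# a query with no obvious equivalent for a diagram of boxes.
tree = ast.parse(source)
names = sorted(node.name for node in ast.walk(tree)
               if isinstance(node, ast.FunctionDef))
print(names)  # → ['add_item', 'remove_item']
```

The same idea is what powers grep, IDE "go to definition", and refactoring tools; all of it falls out of code being text.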

Language is the answer to your question... (5, Insightful)

necro351 (593591) | about a year ago | (#46192061)

...and I do not mean programming language, though that can help.

There is not a big gain (any gain?) to seeing a square with arrows instead of "if (a) {b} else {c}" once you get comfortable with the latter. I think you hinted at the real problem: complexity. In my experience, text is not your enemy (math proofs have been written mostly in text for millennia); the real work is finding elegant (and therefore more readable) formulations of your algorithms/programs.

Let me expand on that. I've been hacking the Linux kernel, XNU, 'doze, POSIX user-level, games, javascript, sites, etc..., for ~15 years. In all that time only one thing has made code easier to read for me and those I work with, and that is elegant abstractions. It is exactly the same thing that turns a 3--4 page math proof into a 10--15 line proof (use Liouville's theorem instead of 17 pages of hard algebra to prove the fundamental theorem of algebra). Programming is all about choosing elegant abstractions that quickly and simply compose together to form short, modular programs.

You can think of every problem you want to solve as its own language, like English, or Music, or sketching techniques, or algebra. Like a game, except you have to figure out the rules. You come up with the most elegant axiomatic rules that are orthogonal and composable, and then start putting them together. You refine what you see, and keep working at it, to find a short representation. Just like as if you were trying to find a short proof. You can extend your language, or add rules to your game, by defining new procedures/functions, objects, etc... Some abstractions are so universal and repeatedly applicable they are built into your programming language (e.g., if-statements, closures, structs, types, coroutines, channels). So, every time you work on a problem/algorithm, you are defining a new language.
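A toy illustration of that composition (the example itself is invented): the same computation written once with ad-hoc bookkeeping and once built from the language's composable pieces.

```python
# Ad-hoc version: the intent is buried in loop bookkeeping.
def total_even_squares_loop(numbers):
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

# Composed version: filtering, mapping, and summing are each a
# small reusable rule, snapped together in one expression.
def total_even_squares_composed(numbers):
    return sum(n * n for n in numbers if n % 2 == 0)

print(total_even_squares_loop(range(10)))      # → 120
print(total_even_squares_composed(range(10)))  # → 120
```

The second form reads like the rule it implements, which is the "short proof" effect in miniature.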

Usually, when defining a language or writing down the rules to a game, you want to quickly and rapidly manipulate symbols and assign abstractions to them, so composing rules can be done with an economy of symbols (and complexity). A grid of runes makes it easy to quickly mutate and futz with abstract symbols, so that works great (e.g., a terminal). If you want to improve on that, you have to understand that the problem is not defining a "visual programming language"; that is like trying to encourage kids to read the classics by offering a more elegant and intuitive version of English to non-literate people. The real problem is finding a faster/easier way to play with, manipulate, and mutate symbols. To make matters worse, whatever method you use is limited by the fact that most people read (that is, de/serialize symbols into abstractions in their heads) in 2D arrays of symbols.

I hope this helps define the actual problem you are facing.

Good luck!

Visual Boxes Aren't Code (1)

rhysweatherley (193588) | about a year ago | (#46192065)

Because while programming by joining prefabricated boxes together with lines sounds awesome, it's what is inside the boxes that is important. If the box you need is not already written, then you need variable assignment, conditionals, and loops to write a new box. And then all of a sudden you are back to writing text code even if it is drag-n-drop "if" statements encoded in XML. At that point you might as well give the programmer a text edit window and get out of the way. The lines are the least interesting part of an application, but they are the only parts that even make sense to do graphically.