
Walter Bright Ports D To the Mac

Soulskill posted more than 5 years ago | from the seize-the-d dept.

Programming

jonniee writes "D is a programming language created by Walter Bright of C++ fame. D's focus is on combining the power and high performance of C/C++ with the programmer productivity of modern languages like Ruby and Python. And now he's ported it to the Macintosh. Quoting: '[Building a runtime library] exposed a lot of conditional compilation issues that had no case for OS X. I found that Linux has a bunch of API functions that are missing in OS X, like getline and getdelim, so some of the library functionality had to revert to more generic code for OS X. I had to be careful, because although many system macros had the same functionality and spelling, they had different expansions. Getting these wrong would cause some mysterious behavior, indeed.'"
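
For readers unfamiliar with D, per-OS branching like the kind described above is expressed with D's built-in version blocks. A minimal sketch follows (not the actual runtime-library code; the getline declaration and the ptrdiff_t stand-in for C's ssize_t are illustrative):

    import core.stdc.stdio : FILE;

    version (linux)
    {
        // glibc-only function, declared by hand via extern(C)
        extern (C) ptrdiff_t getline(char** lineptr, size_t* n, FILE* stream);
    }
    else version (OSX)
    {
        // no getline() on OS X: code in this branch falls back
        // to more generic line-reading logic instead
    }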


Shouldn't it be called P? (0)

Anonymous Coward | more than 5 years ago | (#26948465)

BCPL [erols.com]
B [wikipedia.org]
We all know about the C programming language.
So shouldn't C's successor be called P?

Re:Shouldn't it be called P? (0)

Anonymous Coward | more than 5 years ago | (#26949131)

Note that Walter Bright's company is called Digital Mars. Starts with a "D". The name wasn't necessarily chosen solely because it was the next letter after "C".

What? (0, Informative)

Anonymous Coward | more than 5 years ago | (#26948481)

Real developers actually use the Mac?

Re:What? (5, Funny)

Anonymous Coward | more than 5 years ago | (#26948539)

As little as possible. From the article:

I then figured out how to remotely connect to the Mac over the LAN, and (the Mac people will hate me for this) put the Mac in the basement and operate it remotely with a text window.

Re:What? (4, Insightful)

rhombic (140326) | more than 5 years ago | (#26948731)

Why would Mac people hate somebody for that? I ssh into my Macs all the time. I pretty much always have terminal windows open. A lot of the molecular biology software I use (the open EMBOSS set of programs ROCK) is command-line only: it takes files as input & writes files as output. It's a BSD box with pretty paint. Sure, it's nice to have the pretty screens & be able to run things like iPhoto & etc, but at the end of the day the most useful stuff still runs from the > prompt.

depends on the Mac people (4, Insightful)

Trepidity (597) | more than 5 years ago | (#26948953)

People who have been Mac people for a long time generally don't have that workflow, since the BSD underpinnings are a fairly recent addition to the Mac world, whereas many of the GUI conventions have been around much longer.

Re:depends on the Mac people (4, Insightful)

pohl (872) | more than 5 years ago | (#26949023)

"fairly recent!?" Dude, that was a decade ago. I became a Mac user when Rhapsody first came out (it was the NeXT lineage that brought be onboard) and a lot of time has passed since. This reminds me of growing up in Podunk, Nebraska, in that after living their for 10 years the old ladies at the Methodist church were still referring to my mother as "the new girl in town".

Re:depends on the Mac people (3, Informative)

Trepidity (597) | more than 5 years ago | (#26949061)

OS X displaced the classic Mac OS in majority desktop usage sometime around 2002, so about 7 years out of a 25-year history.

Re:depends on the Mac people (1)

FishWithAHammer (957772) | more than 5 years ago | (#26949081)

I would say that most Mac users don't do that, though. A higher proportion of developers do, of course, but I know a couple of pretty heavy developers who use Macs and are outright hostile to the idea of using a command line. Which is funny, because one of them has been writing code on the Mac since System 7, using MPW (which had a command line!).

Mac^H^H^H people are weird.

Re:What? (5, Insightful)

avalys (221114) | more than 5 years ago | (#26948553)

A Mac is a genuine Unix workstation that is much easier to administer, and has much better software and hardware support than Linux.

I can run basically every Linux/Unix application on my Mac, both command-line and GUI, while not having to worry about wireless networking drivers, printer support, power management / sleep support on my laptop, getting accelerated 3D drivers working, or any of the other minor hassles that are involved with setting up and maintaining a Linux install.

If you walk into the computer science department at MIT, basically all the faculty have a Mac, and fully half the students do. These people are not buying Macs because they saw a cool ad on the bus - they're buying them because a Mac is the best tool available.

The argument that Macs are just expensive, "designer" PCs that look pretty and sell well because Apple has marketed them well doesn't hold water. Yes, they have nice hardware, and a clean, polished, slick UI, and that does make them more pleasant to work with than some blob of Dell plastic running Vista - but they have the functionality to back up their appearance, as well.

Yeah, they're more expensive. If you value your time at all, you should realize that spending an extra $100 on a Mac is well worth it if it improves your productivity. Hell, if you ever spend two hours fighting with some weird issue on your Linux box, it's no longer saving you any money. You know how long I've spent fighting with the OS to get my wireless working, or hibernate working, or whatever, in Mac OS X, in the five years I've been using a Mac? Zero. I'm not exaggerating. It lives up to the hype. It "just works". It gets out of my way and lets me get things done.

Re:What? (4, Funny)

Ihmhi (1206036) | more than 5 years ago | (#26948605)

So basically, Mac IS Linux on the desktop?

I think I've just given Linux fans nightmares for months.

Re:What? (0, Insightful)

Anonymous Coward | more than 5 years ago | (#26948639)

Mmmmm... No. Mac is (Free)BSD on the desktop. Sort of.

Re:What? (2, Funny)

Ihmhi (1206036) | more than 5 years ago | (#26948871)

Okay. I'll amend my previous statement.

So basically, Mac IS FreeBSD on the desktop?

I think I've just given FreeBSD fans nightmares for months.

UNIX is UNIX is UNIX (2, Insightful)

argent (18001) | more than 5 years ago | (#26948927)

I think I've just given FreeBSD fans nightmares for months.

You don't understand FreeBSD fans, then. Most of the FreeBSD users I know have Mac desktops. Jordan Hubbard works at Apple now.

Mac on the desktop, FreeBSD in the back office, it's a sweet environment and everything "just works".

Re:What? (0)

Anonymous Coward | more than 5 years ago | (#26948643)

No, Mac is Unix for the desktop.

Mac is UNIX on the desktop (5, Insightful)

argent (18001) | more than 5 years ago | (#26948873)

Linux is also UNIX on the desktop. It's just an oddball version of UNIX, with a whole bunch of extra APIs that people using Linux get used to and come to depend on, so they think writing portable code means "it runs on Red Hat and Suse" (or Debian and Ubuntu, if you're on the Left Hand path), and then when they go to port to a more standard version of UNIX, they write stuff like this:

'[Building a runtime library] exposed a lot of conditional compilation issues that had no case for OSX. I found that Linux has a bunch of API functions that are missing in OSX, like getline and getdelim, so some of the library functionality had to revert to more generic code for OSX. I had to be careful, because although many system macros had the same functionality and spelling, they had different expansions. Getting these wrong would cause some mysterious behavior, indeed.'

If you're writing code that depends on the expansion of system macros, or if you're depending on obscure Linux-only functions, you're writing unportable code. What really bothers me is the idea that someone writing a Linux-only program would already have run into situations where they had to conditionally compile code. Has Linux really fragmented that much?

Re:Mac is UNIX on the desktop (4, Interesting)

RedK (112790) | more than 5 years ago | (#26949053)

The Open Group would disagree about Linux being Unix. Someone would still have to have it certified, and it still wouldn't pass the certification, because there are still missing features. Linux is compatible with most of the Unix specification; it is not Unix.

Re:What? (1)

Cillian (1003268) | more than 5 years ago | (#26948649)

I'm not going to club you into oblivion for that post, but I'd like to quiz you on the claim that basically every Linux/Unix app runs. I'm not disagreeing, I'm curious. Are we talking the level of support that Wine gives on Linux? Or are we talking eeeeverything?

Re:What? (1)

leamanc (961376) | more than 5 years ago | (#26948877)

Yes, eeeeverything that can be run on Linux/Unix can be run on a Mac. A lot of things are already there, or downloadable in binary form. For most everything else, there's MacPorts or Fink. If it's not covered there, there's always downloading source code and compiling. This sometimes is a lot of extra work.

It all depends on how much extra work you want to put into it, and how familiar you are with *nix environments.

Re:What? (0)

Anonymous Coward | more than 5 years ago | (#26949043)

"It all depends on how much extra work you want to put into it, and how familiar you are with *nix environments."

So in other words, most Mac users actually can NOT run Linux/Unix apps on their desktop.

Re:What? (3, Informative)

jcupitt65 (68879) | more than 5 years ago | (#26949075)

Sadly, MacPorts and Fink are pretty poor :( They don't have enough people, and most of the packages are broken or out of date. I have simple patches for projects I run that have been sitting in the MacPorts tracker for more than six months and still have not been approved.

Debian/Ubuntu/etc. still have by far the best package repository, and that's enough to make my Mac almost useless and my Linux laptop the place where I do most of my work. Plus OS X is rather slow, argh.

Re:What? (1)

kanweg (771128) | more than 5 years ago | (#26948669)

I've been a Mac head since I bought an LC. My company is run with Macs only. However, I have A Lot To Complain About (TM). Sound goes missing after switching accounts (it can be switched back on by changing the source). Initiating iChat communications fails half the time. E-mail accounts don't stick in Mail. The cursor makes sudden sweeps across the screen. When you want to fax and type a number, it starts to assume some person from the address book, but when it is not that person, you end up with a mess that is Very Hard To Edit (TM) in the fax number field. Oh, and lots more. Well, OK, no viruses, so in that respect I'm still a happy camper.

Bert

Re:What? (2, Funny)

psergiu (67614) | more than 5 years ago | (#26948727)

Sound: multiple sound cards, eh?
iChat: your router or ISP sucks. It works OK for everybody else.
E-mail: clean your caches. This also works OK for everybody else.
Mouse cursor: either don't let direct daylight shine on the Mighty Mouse, or throw that junk away and get a real mouse with an opaque body.
Fax: ... well ... add them all to the address book :)

Re:What? (0)

Anonymous Coward | more than 5 years ago | (#26948683)

My experience of Macs is that they don't "just work" as you and Apple's army claim. It'd be nice if they did, if countless weird Mail.app IMAP bugs were resolved, if OS X wasn't freezing on log off, if that QuickTime update didn't somehow cause iWeb to automatically convert mp3s to .movs unless they're encoded using VBR... Let's not get into the MobileMe farce.

I'd love the Mac to be the platform you're describing, but that's not been my experience at all. More alarmingly, I spend considerably more time fixing issues with day-to-day software on OS X than I do maintaining my Linux development box, and I'm running a source-based distro!

Re:What? (1)

iggymanz (596061) | more than 5 years ago | (#26948719)

Hahaha. Now I have two Macs in the house; the wife and kids love them. But OS X doesn't run on 1/100th of the hardware that Linux will. You mean it has better hardware and software support for Apple's somewhat overpriced hardware (unless you buy four-plus years old used on eBay like me). And what do you mean spend $100 more? More like a thousand or two more for new. The pile of available open source apps is bigger for Linux (or certain BSDs too) than for OS X, though the Mac porting projects are doing well as they catch up. As far as productivity goes, it's the same on either platform for anything I do. As for your argument about spending time resolving hardware issues: I had to look up how to make my laptop's volume thumbwheel work in Linux, and that took 2 minutes. Everything else on all my boxes just works. Oh, and I have an Asterisk VOIP box with Zaptel cards under Debian; how well is that going to work in OS X?

Re:What? (1)

Oswald (235719) | more than 5 years ago | (#26948767)

In general, I've learned not to argue with Mac fans (if you value reasoned discourse, that might give you pause -- I've been clubbed into apathetic, eye-rolling silence), but I must point out that the (obviously trolling) grandparent to your post asked if actual developers used the Mac. Your reply that academics and students sure use the hell out of it is pretty damn funny.

Re:What? (1)

Hognoxious (631665) | more than 5 years ago | (#26948777)

A Mac is a genuine Unix workstation that is much easier to administer, and has much better software and hardware support than Linux.

Except when you want to run the most popular scripting language [slashdot.org].

Re:What? (1)

geoff2 (579628) | more than 5 years ago | (#26949201)

A Mac is a genuine Unix workstation that is much easier to administer, and has much better software and hardware support than Linux.

Except when you want to run the most popular scripting language [slashdot.org].

The lesson, which any Mac developer with a clue has known for years, is that if you want to work with your own customized Perl installation, install your own Perl; don't extend the preinstalled version, which Apple might change in a future update. Installing Perl on a Mac involves, basically: a) downloading the software; b) typing "sh Configure -de"; c) typing "make"; d) typing "make install". Not too tough, really.

Re:What? (1)

itsdapead (734413) | more than 5 years ago | (#26948789)

I can run basically every Linux/Unix application on my Mac, both command-line and GUI, while not having to worry about wireless networking drivers, printer support, power management / sleep support on my laptop, getting accelerated 3D drivers working, or any of the other minor hassles that are involved with setting up and maintaining a Linux install.

I partly agree, but in defense of Linux, you seem to be comparing a shrink-wrapped Mac (hand built by dusky maidens from only the finest OS X-compatible components) with slapping a standard Ubuntu CD on your old Dell. If you bought a purpose-built Linux workstation you'd expect it to, likewise, have been made out of Linux-supported components and be properly configured out of the box.

You'd probably have similar driver nightmares with OS X if, rather than buy a nice shrink-wrapped system from Apple, you tried to turn an existing PC into a Hackintosh.

Re:What? (1)

SSCGWLB (956147) | more than 5 years ago | (#26948791)

OK, what's with this Unix love? Yes, it is a Unix workstation, but so is HPUX. HPUX is ...different. You will understand if you have ever used it. Being Unix compliant does not mean an OS is good, reliable, or stable. Don't get me wrong, I have used OS X and like it fine. I think it can stand on its own as an OS, as opposed to relying on 'but it's Unix!'. I would rather have the free (as in beer) Linux with the freedom to do what I want with my hardware/software. I much prefer the Gentoo EULA to the Apple EULA.

The hardware statement is an out-and-out lie. OS X does NOT work on a wider range of hardware, nor does it have support for more hardware. OS X will work on the laptops Apple sells, which does not surprise me. If I am paying a premium for a laptop, I do expect the OS it _comes_ with to work.

Let's take, for example, my Dell m1730. I installed Red Hat and Gentoo on it; _everything_ works. No, everything didn't just work right away, but it wasn't difficult. Good luck trying to install OS X on it. Besides violating the EULA (http://www.apple.com/legal/sla/), it doesn't 'just work'. I googled it to make sure.

I can install (and have installed) Linux on a 486, the crappy PIII in my closet, and the dual quad Xeon at work. And it works. Even my odd video card and its friend the ancient SoundBlaster. Good luck with OS X; I am sure you can get most things working, eventually.

I have a 4-year-old Dell laptop that came with Windows XP. Wireless works great; it suspends and hibernates great. Everything worked great out of the box. I've never had an ounce of problems with it, and I use it every day. This does not make it a good OS. So I installed Linux on it. Which works great also.

One additional point: you can pay a lot more than $100 extra for a comparable laptop.

I know you love your mac, but please don't share the kool-aid.

What "being UNIX" means... (2, Interesting)

argent (18001) | more than 5 years ago | (#26948999)

OK, what's with this Unix love? Yes, it is a Unix workstation, but so is HPUX. HPUX is ...different. You will understand if you have ever used it.

Don't talk to me about HPUX, I'm still bitter about Alpha.

Being Unix compliant does not mean an OS is good, reliable, or stable.

Not being UNIX would mean that it doesn't matter how good, reliable, or stable it is... I wouldn't be using it. I've done my time in the trenches dealing with VMS, TOPS, RSX, RTE-IV, MS-DOS, CP/M, Windows, AmigaDOS, Exec/1100, many of which were by all kinds of measures good, reliable, or stable. But dealing with different operating systems sucks rotting frog innards through used oil filters, and I'm too old for that kind of manure.

Being UNIX means that, since it also happens to be good, reliable, and stable (which it is), it's worth using. If OS X were based on Copland or even BeOS, I'd still be running free UNIX on my desktop.

Re:What? (0, Troll)

WillKemp (1338605) | more than 5 years ago | (#26948827)

OS X is undoubtedly a good OS, but Mac hardware is crap. It's about time they started selling OS X on its own.

Although that might seriously damage Linux, so maybe it's better that they don't.

Re:What? (1, Flamebait)

Zero__Kelvin (151819) | more than 5 years ago | (#26948861)

"A Mac is a genuine Unix workstation that is much easier to administer, and has much better software and hardware support than Linux."

ROTFLMAO! I just learned something new! I was unaware that OS X runs on ALPHA, ARM, and the other 19 processor platforms that Linux supports.

"I can run basically every Linux/Unix application on my Mac, both command-line and GUI, while not having to worry about wireless networking drivers, printer support, power management / sleep support on my laptop, getting accelerated 3D drivers working, or any of the other minor hassles that are involved with setting up and maintaining a Linux install."

What you say is, of course, patently absurd. You are comparing a system already set up with a bare-bones system. None of MY customers worry about those things either, since they compare their already-set-up Linux system to an already-set-up Mac.

"Hell, if you ever spend two hours fighting with some weird issue on your Linux box, it's no longer saved you any money. "

Perhaps the most absurd statement you make, and that is saying a lot.

Re:What? (1)

argent (18001) | more than 5 years ago | (#26948945)

I was unaware that OS X runs on ALPHA, ARM, and the other 19 processor platforms that Linux supports.

You spelled that wrong. It's "BSD runs on Alpha, ARM, ...". Mac OS X just happens to be the best BSD version for the desktop... and it's a much better desktop than any Linux desktop.

Re:What? (-1, Troll)

Zero__Kelvin (151819) | more than 5 years ago | (#26949143)

You are very confused, or intentionally trying to confuse others. OS X is a BSD derivative. It is NOT BSD. Here, have a free clue [wikipedia.org]:

Mac OS X is based on the Mach kernel and is derived from the Berkeley Software Distribution (BSD). Certain parts from FreeBSD's and NetBSD's implementation of Unix were incorporated in Nextstep, the core of Mac OS X. Nextstep was the object-oriented operating system developed by Steve Jobs' company NeXT after he left Apple in 1985.

While I applaud your efforts to help avalys weasel out of his absurd statement, you should be aware that this is Slashdot. While it is obvious that not all Slashdot readers have a clue, many (or most?) do, so you just look silly trying to pretend that OS X is the same as BSD.

"Mac OS X just happens to be the best BSD version for the desktop... and it's a much better desktop than any Linux desktop."

Now you're borrowing from avalys' playbook. You clearly have no idea how to properly configure Linux for the desktop and also don't know anybody who does (or you are just trolling). I assure you that OS X users drool when they see my setup.

I've already pointed out that OS X is NOT BSD, so saying it is simply "the best BSD version for the desktop", leaving aside the absurdity of the statement if OS X was BSD, is clearly erroneous.

Re:What? (3, Insightful)

mlwmohawk (801821) | more than 5 years ago | (#26948971)

A Mac is a genuine Unix workstation that is much easier to administer, and has much better software and hardware support than Linux.

It has *better* software support from major ISVs, I will grant you that, but it does not have better software support generally, and Linux supports far more hardware than Mac OS does. Not all Linux software runs on the Mac OS either.

My wife and son have macs, and I tell you, I'll take Linux every time.

I can run basically every Linux/Unix application on my Mac, both command-line and GUI, while not having to worry about wireless networking drivers, printer support, power management / sleep support on my laptop, getting accelerated 3D drivers working, or any of the other minor hassles that are involved with setting up and maintaining a Linux install.

I have a couple printers that don't work on my wife's, son's or mom's mac.

I have an AMD noname and a Dell desktop as well as an HP pavilion laptop, and I don't have any real problems. I have to use the unsupported nVidia driver, but that isn't too hard to install. Things like Skype just work.

If you walk into the computer science department at MIT, basically all the faculty have a Mac, and fully half the students do. These people are not buying Macs because they saw a cool ad on the bus - they're buying them because a Mac is the best tool available.

That is a fairly subjective statement and a dubious conclusion. Most of the guys I have worked with use Macs at work because the "organization" in which they work requires MS Office, which is not supported on Linux. They would rather use Linux or FreeBSD.

Yeah, they're more expensive. If you value your time at all, you should realize that spending an extra $100 on a Mac is well worth it if it improves your productivity. Hell, if you ever spend two hours fighting with some weird issue on your Linux box, it's no longer saving you any money.

I am less productive on a Mac, and I've spent my time fixing Macs as well. I have a standing offer of a free bottle of wine from an upscale wine shop because I was able to get their printer working on their Mac. They had been trying for months.

When it comes to productivity, let's see a Mac do this:

ssh -X hostaddr application

And have the GUI application pop up on a remote screen without shipping the WHOLE screen, like VNC does.

in Mac OS X, in the five years I've been using a Mac? Zero. I'm not exaggerating. It lives up to the hype. It "just works". It gets out of my way and lets me get things done.

It's funny: EVERY SYSTEM has issues. People who claim theirs doesn't are lying. Like I said, my wife, mom, and son have Macs. I've developed software on Macs periodically for about 15 years. OS X does have its issues. There are hardware issues on Macs.

For average users I recommend mac because it has far fewer problems than Windows. For techies, there is no substitute for Linux or FreeBSD. (I prefer Linux, but I have friends who prefer FreeBSD.)

All the world's a VAX. (4, Informative)

argent (18001) | more than 5 years ago | (#26949055)

Not all Linux software runs on the macOS either.

Yeah, there are a lot of Linux programmers who wouldn't know how to write portable code if the portable code fairy shat clue down their throats. Last decade it was SunOS programmers; the decade before that, it was people who thought all the world was a VAX. The world is full of people like that.

For techies, there is no substitute for Linux or FreeBSD. (I prefer Linux, but I have friends who prefer FreeBSD.)

Ask your friends about porting Linux code from people who think portable means "it compiles on Red Hat and Suse... ship it!"

Oh, while we're on the subject, you do know that Jordan Hubbard works at Apple now, don't you?

Re:What? (2, Informative)

pohl (872) | more than 5 years ago | (#26949207)

let's see a Mac do this:

ssh -X hostaddr application

And have the GUI application pop up on a remote screen without shipping the WHOLE screen, like VNC does.

You can't seriously be suggesting that X11 is unavailable [apple.com] for OS X. If you have an X11 application that you would like to run, you can certainly do what you're suggesting. No serious UNIX weenie should have any trouble building it [apple.com]. Your only possible room for complaint here is 1) that not every application is an X11 application (something for which most users are thankful), or 2) that you want your mommy to have compiled the app for you already.

Re:What? (0)

Anonymous Coward | more than 5 years ago | (#26948973)

Wow, this Mac sure sounds fantastic! I think I'm going to take Windows off my PC and install it... Oh wait, I can't do that? Why not?

Macs have great hardware support... If you buy it from Apple. It's the same deal as if you buy from a "Linux OEM." (And no, I don't mean companies like Dell that don't even bother to make their pre-installs work properly)

Are you insane? (1, Insightful)

gbutler69 (910166) | more than 5 years ago | (#26949003)

You are not going to seriously suggest that Mac OSX supports more hardware than Linux are you?

I have a System76 Laptop running Ubuntu Linux 8.10. I have ZERO problems with hardware support of ANY KIND!

  • Wi-Fi - CHECK!
  • LAN - CHECK!
  • 3-D Desktop with BEAUTIFUL effects - CHECK!
  • Games - CHECK!
  • Productivity Software - CHECK!
  • Music Software - CHECK!
  • Video Software - CHECK!
  • 3-D Modeling Software - CHECK!
  • Imaging Software - CHECK!
  • Vector Graphics Software - CHECK!
  • Programming Environments and IDE's - CHECK!
  • Wireless 3-G Broadband with Verizon - CHECK!
  • Anything else I can imagine - CHECK!

Your entire post is a complete puddle of steaming ass-goo for anyone who actually is in the know about GNU/Linux/Ubuntu. It truly is laughable!

Re:What? (2, Insightful)

jonaskoelker (922170) | more than 5 years ago | (#26949017)

you should realize that spending an extra $100

I think you forgot a zero. Prices may be different in my-vs-your neck of the worlds, though.

and has much better software and hardware support than Linux.

I think Linux has much wider hardware support (it works on non-apple hardware too), whereas OS X has full support for a much smaller set of hardware. What's better depends on personal preferences.

I can run basically every Linux/Unix application on my Mac

Really? Didn't the summary just say that some of the system calls are missing on OS X and some macros are different? Or did you mean running the binaries (in which case, there's still the system calls)? Or do you cheat by running Linux on your Mac? ;-)

You love your Mac, and that's great for you (really, I mean that). But I think you might be overselling it just a little. Bullshit detector went from green to yellowish green ;-)

Re:What? (1)

argent (18001) | more than 5 years ago | (#26949195)

Didn't the summary just say that some of the system calls are missing on OS X and some macros are different?

No, the summary said that the code was written with unnecessary dependencies on obscure libraries that didn't happen to be shipped on OS X, and that it was so buggy it depended on the expansion of macros being identical on different operating systems.

I can pretty much guarantee that any program with those kinds of dependencies would have just as much of a problem porting to any other UNIX system. It wasn't a "Linux/Unix" program, it was a "Linux-only" program.

I like Macs, but they're not easy to administer (1)

Trepidity (597) | more than 5 years ago | (#26949033)

OS X is easier to operate, and is set up to work well with Apple hardware. Those are nice features, and the reason I have an OS X laptop. But it surely isn't easy to administer. It's indeed quite terrible, since there's no package management to speak of.

Ever tried to walk a classroom of Mac-using high-school students through getting pygame installed and working on their OS? Including the proper version of Python if their OS X has an old version, PyObjC, and so on? On Debian or Ubuntu, this requires double-clicking "python-pygame" in Synaptic, and all dependencies/upgrades/paths/etc. are handled automatically. On OS X it's a giant pile of manual bullshit.

And god help you if you have to resort to some mish-mash of .dmg installers, fink, and macports.

Re:I like Macs, but they're not easy to administer (1)

argent (18001) | more than 5 years ago | (#26949147)

Ever tried to walk through a classroom of mac-using high-school students how to get [some non-portable piece of Linux-only code that was written with the assumption that it would only ever be run on Linux]

That's not a "package management" problem. FreeBSD has had better package management than Linux since before Linux HAD any package management system, and getting random "all the world's Red Hat" (or "all the world's Ubuntu") code running can be a pain and a half. That's a problem caused by writing code with billions of unnecessary dependencies.

Dependencies are a problem. Any sensible project manager will identify all dependencies and eliminate as many of them as they can. But "sensible project management" and Linux just don't belong in the same sentence.

I've run into the same problem on Linux, I spent a week once trying to get a set of packages and scripts working so I could package up some software to ship to customers, on Red Hat, and I ended up having to use the FreeBSD port of the package to figure out what all the dependencies were. I couldn't depend on them being able to use "yum" to pull it all together, because this was for an install that had to work on an offline system. Telling electric power utilities they had to connect their offline control system to the internet to do an upgrade was NOT an option.

Package management systems of the complexity of the ones Linux uses are a result of years of bad project management.

Re:What? (1)

sammyF70 (1154563) | more than 5 years ago | (#26949213)

Hmm, excuse me, but: better hardware support?
Since when? Can I just take, let's say, some non-Apple graphics adapter I've got lying around, plug it in, and it will work? Or go out and buy some random wireless adapter?

Better software support is arguable, but better hardware support? I think not.

Re:What? (2, Interesting)

dkf (304284) | more than 5 years ago | (#26948645)

Real developers actually use the Mac?

Of course. The MacBook and MacBook Pro are nice laptops for on the move, and they run ssh, gcc, vi, emacs and X11 perfectly.

Re:What? (1)

evil_aar0n (1001515) | more than 5 years ago | (#26948687)

Yes. At work, they've given me a kick-ass Dell - serious high-end piece of machinery - and I almost never touch it. Instead, aside from e-mail, all my efforts are through my personal MacBook Pro. Even if I'm VNC'ing over to my Solaris session, I still use the Mac.

Re:What? (0)

Anonymous Coward | more than 5 years ago | (#26948823)

Apple "developers" (a.k.a. associates) use a modern program called Hypercard to create softwares known as spreadsheets excel. And they must do this with pictures because Apples do not understand words in the normal sense but prefer pictures called gooey. I hope this sheds more light on Apple "developers"

Re:What? (1)

Comatose51 (687974) | more than 5 years ago | (#26948993)

The damn things ship with developer tools (Python, Ruby, etc.) and SDKs out of the box. Not sure if you're from the Bay Area or Silicon Valley, but those things are really popular among software developers here. I switched to a Mac from Windows because I was sick of my tools getting in the way of my work. Not sure if things have improved much with Windows Vista (heard otherwise) or Windows 7 (heard good things), but I'm happy enough with Mac OS X that it would take a lot to make me give it up.

First comment ever! (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26948499)

First comment ever!

Apple floats my boat (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26948543)

Friday is my busy day. And my Apple notebook helps me get the "job" done. You see, after I get off work as a substitute gym teacher, I have to race across town to the NAMBLA meeting. [nambla.de]

Then it's a quick pit stop at McDonald's for a bite and to check out who is having a Happy Meal in Playland. Afterward I must zip uptown to the Apple Users' Group meeting. Finally by 11:00 pm it's time to head home with my PowerBook and scope out the K12 chat rooms.

As I said, it's my busy day. If it weren't for my Apple computer, I don't know how I could do it all.

Re:Apple floats my boat (0, Troll)

Anonymous Coward | more than 5 years ago | (#26948775)

I don't mean to be presumptuous, but might I suggest a more efficient way? If you would combine the Apple Users' Group meeting with the NAMBLA meeting, you could save a lot of time (not to mention gasoline!). After all, the membership rosters are practically identical. This would give you more time for the K-12 "chat rooms" and "Happy Meals" at Play Land. Just my 2 cents.

Re:Apple floats my boat (0)

Anonymous Coward | more than 5 years ago | (#26948817)

This isn't fair, I am sure there are plenty of mac users without a fondness for young boys.

D sucks (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26948549)

C++ rules.

Re:D sucks (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26948573)

Woho! Yeah!

Re:D sucks (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26948653)

C++ sucks. Visual Basic rules.

Holy design problem, Batman! (0, Troll)

captnjameskirk (599714) | more than 5 years ago | (#26948569)

Problem after problem was traced back to the use of global variables. Over time I'd eliminated a lot of them, but there's a lot left. It's hard to change how a function works if there's a back channel of globals passing state around. Globals break encapsulation, making code difficult to understand.

Sounds like it's time to refactor some code there, Walt.

High performance of C++ equal to D??? (3, Interesting)

master_p (608214) | more than 5 years ago | (#26948589)

I don't think D will ever have the high performance of C++, because D objects are all allocated on the heap. The D 'auto' keyword is just a compiler hint (last time I checked) to help in escape analysis. D has structs, but one has to decide upfront whether a type has value or reference semantics, and that creates a major design headache. Avoiding the headache by making everything have reference semantics negates the advantages of struct.

D is a mix of C and Java, with the C++ template system bolted on top. It is in no way C++. D is not what a veteran C++ programmer expects as the next-generation C++.
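
(A minimal sketch of the value-vs-reference split described above; these are standard D semantics, with types invented for illustration:)

    class RefPoint  { int x; }   // classes are reference types in D
    struct ValPoint { int x; }   // structs are value types

    void main()
    {
        auto a = new RefPoint();
        auto b = a;           // b aliases the same heap object
        b.x = 5;
        assert(a.x == 5);     // the change is visible through a

        ValPoint s;
        auto t = s;           // t is an independent copy
        t.x = 5;
        assert(s.x == 0);     // s is untouched
    }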

Re:High performance of C++ equal to D??? (0)

Anonymous Coward | more than 5 years ago | (#26948627)

Maybe you're right, I can't say.
Maybe D's author was not solely concerned with being on par with C++'s performance. :)

Re:High performance of C++ equal to D??? (5, Informative)

Anonymous Coward | more than 5 years ago | (#26948635)

http://www.digitalmars.com/d/1.0/memory.html#stackclass - Objects in D are not always allocated on the heap. Also, you've clearly never used templating in D if you think it is the C++ template system bolted on top ;)
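
(The stackclass link refers to D 1.0's scope storage class: a class instance declared scope is stack-allocated and destroyed deterministically at scope exit. A minimal sketch, with an invented class:)

    import std.stdio;

    class Resource
    {
        this()  { writeln("acquired"); }
        ~this() { writeln("released"); }
    }

    void work()
    {
        scope r = new Resource();  // D 1.0 scope class: lives on the stack
        writeln("working");
    }   // "released" prints here, without waiting for the GC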

Re:High performance of C++ equal to D??? (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26948655)

Don't forget that it uses parentheses for template instantiation instead of angle brackets. How am I supposed to tell the difference?

Oh, and it doesn't allow you to overload stuff like the address-of operator. That makes it impossible to be productive!

I ORDERED A D COMPILER FOR MY COUSIN WHO HAD CANCER. COMPILER NEVER ARRIVED AND MY COUSIN DIED.

Re:High performance of C++ equal to D??? (5, Informative)

ardor (673957) | more than 5 years ago | (#26948681)

The GC is the way to go for complex applications. The reason is simple: the GC has a global overview of all memory usage in the application (minus special stuff like OpenGL textures). This means that the GC can reuse previously allocated memory blocks, defragment memory transparently, automatically detect and eliminate leaks, etc.

Somewhat less obvious is that a GC allows by-reference assignments to be the default. In C++, by-value is the default: a = b will always copy the contents of b to a. While this is OK for primitive stuff, it is certainly not OK for complex types such as classes. In 99.999% of all cases, you actually want a reference to an object, not a copy of that object. But, as said, the default behavior of assignment is "copy value".

This is a big problem in practice. The existence of shared_ptr, reference-counting pointers, etc. is a consequence. We could redefine the default behavior as "pass a reference of b to a", but who will then take care of the lifetime of b? Using a GC, this last question is handled trivially: when the GC detects 0 references, b is discarded.

Now, once you have by-reference as default, things like closures get much easier to introduce. Neither D nor C++ have them at the moment, but C++0x requires considerably more efforts to introduce them. Functional languages all have a GC for a reason.

D did another thing right: it did not remove destructors, like Java did. Instead, when there are zero references to an object, the GC calls the destructor *immediately*, but deallocates the memory previously occupied by that object whenever it wishes (or it reuses that memory). This way RAII is possible in D, which is very useful for things like scoped thread locks.
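
(On the scoped-locks point: besides destructors, D also has scope guard statements for deterministic cleanup. A minimal sketch; the Mutex class here is an invented stand-in, not a real library type:)

    import std.stdio;

    class Mutex   // hypothetical stand-in for a real lock
    {
        void lock()   { writeln("lock"); }
        void unlock() { writeln("unlock"); }
    }

    void critical(Mutex m)
    {
        m.lock();
        scope(exit) m.unlock();  // runs at scope exit, even on exception
        writeln("inside critical section");
    }

    void main()
    {
        critical(new Mutex());
    }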

They also simplified the syntax, which is one of the major problems of C++. Creating D parsers is not hard. Try creating a C++ parser.

Now, what they got wrong:
- operator overloading
- const correctness
- lvalue return values (which would solve most of the operator overload problems)
- no multiple inheritance (which does make sense when using generic programming and metaprogramming; just see policy-based design and the CRTP C++ technique for examples)
- crappy standard library called Phobos (getting better though)
- and ANOTHER de-facto standard library called Tango, which looks a lot like a Java API and makes little use of D's more interesting features like metaprogramming, functional and generic programming

Re:High performance of C++ equal to D??? (0)

Anonymous Coward | more than 5 years ago | (#26948925)

D2 has full closures.

Re:High performance of C++ equal to D??? (0)

Anonymous Coward | more than 5 years ago | (#26948965)

Somewhat less obvious is that a GC allows by-reference assignments to be the default. In C++, by-value is the default: a = b will always copy the contents of b to a. While this is OK for primitive stuff, it is certainly not OK for complex types such as classes. In 99.999% of all cases, you actually want a reference to an object, not a copy of that object. But, as said, the default behavior of assignment is "copy value".

Wrong. I made the same mistake before. I'm sure you're thinking of Java or C. Everything there is actually copy-by-value. Everything. Pass-by-reference refers to the fact that you pass the variable by reference, not the object's instance. All "references" in Java are actually pointers to the instances (hence the pointer value gets copied). It's particularly confusing because in Java `.' is more akin to the `->' in C++ than the `.' operator. Think about it:

String foo = "abc";
bar(foo);
System.out.println(foo);

Is there any way in Java or C for the system to print out anything but "abc"? In C++ there is:

void bar(std::string &x)
{
      x = "cba";   // assigns through the reference, changing the caller's string
}

The system will then print "cba" instead of "abc". So in 99.999% of cases, you don't want to pass by reference or to copy the object; instead, you want to pass a copy of the pointer to some instance.

No multiple inheritance is pretty good from a readability & codability viewpoint, as long as there is some mechanism that allows you to more clearly express your concept (i.e. Java interfaces). Multiple inheritance creates confusion potentially - which parent am I inheriting function foo from if both of them declare it? What is the destructor order if parents have virtual destructors?
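
(For comparison, D takes the same route as Java here: single class inheritance plus any number of interfaces. A minimal sketch with invented types:)

    interface Drawable     { void draw(); }
    interface Serializable { void save(); }

    // at most one base class, any number of interfaces, so there is
    // never an ambiguous inherited implementation to resolve
    class Sprite : Drawable, Serializable
    {
        void draw() { /* render */ }
        void save() { /* serialize */ }
    }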

Not sure what your problems with operator overloading are. Presumably you think that the return value should be part of the method's signature. Not sure if this is a good or bad idea - maybe it creates issues for compiler writers & ambiguity in the error messages?

Not sure how you think they've got const-correctness wrong - aren't they just closing the loophole in C/C++?

Re:High performance of C++ equal to D??? (1)

ardor (673957) | more than 5 years ago | (#26949137)

I'm sure you're thinking of Java or C.

Wrong. I am thinking of C++. Count the times you pass a reference or a pointer vs. the times you actually want to copy large, complex objects. The outcome is pretty clear. Hell, boost even contains a "noncopyable" class as a tool to disallow deep copies.

Deep copies are the exception. Shallow copies are the rule. This is totally obvious, and virtually all other languages do it this way. The only two reasons why C++ doesn't have it that way are the C legacy and the ownership problem (which is solved by the GC).

Yet another problem with by-value as default is move semantics. It basically boils down to the problem of extra copies in return values. C++0x sort of fixes it with the aforementioned move semantics. See http://www.codeguru.com/cpp/misc/misc/versioninfo/article.php/c14443 [codeguru.com] for more. Note that this issue is an artifact of the by-value-default design decision in C++.

So in 99.999% of cases, you don't want a to pass by reference or to copy the object - instead, you want to pass a copy of the pointer to some instance.

Wrong conclusion. Your example just shows that the assignment operator is ill-defined. Besides, this is possible in D, since you can overload the assignment operator.

And how do you make an actual copy of the object? Since *deep* copies are rarely done on purpose, a clone() method is the way to go.

No multiple inheritance is pretty good from a readability & codability viewpoint, as long as there is some mechanism that allows you to more clearly express your concept (i.e. Java interfaces). Multiple inheritance creates confusion potentially - which parent am I inheriting function foo from if both of them declare it? What is the destructor order if parents have virtual destructors?

I take it you never did C++ metaprogramming then? Never implemented policies, CRTP and the like?

Not sure what your problems with operator overloading are. Presumably you think that the return value should be part of the method's signature. Not sure if this is a good or bad idea - maybe it creates issues for compiler writers & ambiguity in the error messages?

See this example:

    x[5] = 1;

How is this done in C++? x's index operator returns an lvalue (e.g. a non-const reference), whose type in turn defines the assignment operator. This is the right way, since retrieval and assignment are two orthogonal concepts. What if x is some sort of container class? If there were a "[]=" operator, the container class would have to define retrieval *and* assignment. But the latter is not the container's job.

D has an opIndexAssign operator, which is "[]=". Fortunately, there is progress; an opIndexLvalue operator has been accepted into D2. However, opIndex would be sufficient if it were possible to define the return value itself as an lvalue.
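
(Concretely, the lowering under discussion; a minimal sketch of opIndex/opIndexAssign on an invented container type:)

    struct Container
    {
        int[10] data;

        int  opIndex(size_t i)                  { return data[i]; }
        void opIndexAssign(int value, size_t i) { data[i] = value; }
    }

    void main()
    {
        Container x;
        x[5] = 1;           // the compiler rewrites this to x.opIndexAssign(1, 5)
        assert(x[5] == 1);  // rewritten to x.opIndex(5)
    }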

Re:High performance of C++ equal to D??? (0)

Anonymous Coward | more than 5 years ago | (#26948809)

No. "auto" is for type inference and has nothing to do with escape analysis. And you should not be mixing value and reference semantics in the same type. And you should not make general statements when you don't even know what you're talking about.


digital mars c++ (1)

drac667 (878093) | more than 5 years ago | (#26948611)

From what I've understood, his D implementation and Digital Mars C++ share some code. Would he bring Digital Mars C++ to the Mac also? Or to Linux, for that matter?

Tempting (0)

Ihmhi (1206036) | more than 5 years ago | (#26948623)

I'd be too tempted by being able to pick out my own name for a language.

You can go the outright vulgar route and call it something like Cock++. I mean, can you imagine inserting that terminology (no pun intended) into a workplace environment?

Or I'd just have as many terrible puns in it as I could... perhaps call it Hay, so I can confuse people with stuff like the Stack.

semi (0, Troll)

azav (469988) | more than 5 years ago | (#26948629)

Can we FINALLY kill the semicolon at the end of a line? The default condition is that the end of a line terminates the command. Forcing the user to enter a semicolon command terminator when they have already indicated they want to end the command by pressing return/enter forces the user to say "next command" twice. Why must we promote bad design forward?

Re:semi (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26948661)

I think you need to study some more C, C++, Java or C# code, and you'll notice that not every end of line terminates a command.

Re:semi (0)

Anonymous Coward | more than 5 years ago | (#26948685)

More than one whitespace character in a row does not matter in C.

Re:semi (0)

Anonymous Coward | more than 5 years ago | (#26948667)

Because

for(x=0;x<y;x++)

uses less space than

for (x = 0
    x < y
    x++)

Re:semi (1, Funny)

Anonymous Coward | more than 5 years ago | (#26948673)

Oh yeah the semicolon is a MAJOR obstacle for every programmer...

Re:semi (2, Informative)

theurge14 (820596) | more than 5 years ago | (#26948675)

Because the compiler ignores whitespace, it's probably not the best design decision to let a non-visible character be the end-of-line terminator.

Re:semi (1)

WillKemp (1338605) | more than 5 years ago | (#26948803)

I agree. I can't get my head around the insanity of Python's system of indentation delimiting block structure...

What's wrong with lines terminating with semicolons anyway? It encourages much tidier and more readable code, I reckon.

Re:semi (0)

Anonymous Coward | more than 5 years ago | (#26948943)

really;I'm:shocked:that:you'd:say:so;I've:seen:much:much:much:more:completely:
unreadable:and:unintelligible:C:C++:Java:and:C#:code:than:I've:seen:unreadable;python:code:in:my:time;

Usually this is a result of some idiot thinking that better_performance==fewer_source_statements.

Re:semi (1)

ejtttje (673126) | more than 5 years ago | (#26948977)

I wonder: do you use spaces for indentation?
I use tabs, and the indentation-equals-block rule makes a lot of sense to me.

Re:semi (1)

WillKemp (1338605) | more than 5 years ago | (#26949047)

I use tabs for indentation. But I don't use Python.

Maybe I'm wrong, but using an invisible character (which could be one or more tabs, or one or more spaces) seems like it could make the code very hard to read and prone to errors.

Re:semi (1)

ejtttje (673126) | more than 5 years ago | (#26949161)

To me, it enforces that the code *is* easy to read, by ensuring the visual indentation matches the semantic intention.
There's never a mismatch between what you intuitively see and what it actually executes. Which is probably why they let you mix tabs and spaces as long as it's consistent visually. Although personally I think it should be a syntax error to mix them (or better yet, an error to use spaces for indentation in general ;-p)

Re:semi (1)

Just Some Guy (3352) | more than 5 years ago | (#26949183)

Maybe I'm wrong

You're wrong. In the last 6 years of our company using Python for business logic development, we've never had a single bug that was traced back to indentation or an end-of-line problem.

Re:semi (1)

Inf0phreak (627499) | more than 5 years ago | (#26948699)

There are very good reasons why we have stuck with the ; as statement terminator instead of the newline. Not the least of which is that systems simply don't agree on what bytes constitute a newline: Windows thinks it's \r\n, UNIX-ish systems think it's \n, and (pre OS X? Haven't ever owned one myself) Macs think it should be \r. And then you have things like the "logical newline", e.g. std::endl. That is not to say that putting some more work into compiler error messages wouldn't be a good idea. I assume most C++ programmers have tried receiving a _huge_ amount of garbage template crap as an error message instead of "missing semicolon" when trying to output text with std::cout.
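
(Incidentally, library code can paper over all three conventions at once; D2's std.string.splitLines, for instance, accepts \r\n, \n, and \r. A minimal sketch:)

    import std.string : splitLines;

    void main()
    {
        // Windows, Unix, and classic Mac line endings in one string
        auto text = "one\r\ntwo\nthree\rfour";
        assert(splitLines(text) == ["one", "two", "three", "four"]);
    }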

Re:semi (0)

Anonymous Coward | more than 5 years ago | (#26948749)

Not the least of which is that systems simply don't agree on what bytes constitute a newline: Windows thinks it's \r\n, UNIX-ish systems think it's \n, and (pre OS X? Haven't ever owned one myself) Macs think it should be \r.

Oh noes, having to check for a whole THREE possible cases! However can we solve this logistical nightmare!

Re:semi (0)

Anonymous Coward | more than 5 years ago | (#26948905)

Yes, the English language is in peril, what with its ?, ., and !. We must replace them with linefeeds!

I really don't see the problem with this (1)

localroger (258128) | more than 5 years ago | (#26948785)

Simply treat both CR and LF as endlines; a compiler isn't formatting output and ignores a blank line, so treating \r\n as two endlines doesn't matter. Use a visible character, like the _ in VB, to indicate that the next nonblank line is a continuation. I have to agree with the parent that ignoring whitespace is one of the dumbest ideas ever. How often do you really need a 140-character line, versus the frequency of lazy programmers writing unreadable code because the language doesn't require even minimal formatting?

Re:semi (1)

acidblue (716452) | more than 5 years ago | (#26948787)

No one has ever been able to convince me that removing the terminator is a good thing. What happens when the statement does not fit the width of the medium in which it is being displayed? These statements will then wrap (because most of us hate horizontal scroll bars, or we are editing in environments that don't have scroll bars), and then we have to enable whitespace character markers in the editor. So, this becomes cumbersome. Sure, Python does it, and so do Groovy and VB (and many others), but I find this an annoyance.

Go ask any real Javascript programmer and they will tell you that they wish semi-colons were mandatory and not optional.

So, I ask, what is so hard about a little programmer punctuation?

Briggs;

D, E, F,... where will it all end? (1)

Tiger4 (840741) | more than 5 years ago | (#26948733)

Now he's done it. Kicked off the alphabetical arms race. He himself averted it once with the Operator Gambit of + and ++. The crisis was defused.

It was staved off again this decade by what came to be called the Jungle of Earthly Delights Diversions (Python, Ruby, et al). But now he's broken the dam to the last redoubt, the alphabet. There are only so many letters! It is a limited supply! We've feared this since the mid-nineties. Who will save us now?

ZZ99++ (5, Funny)

itsdapead (734413) | more than 5 years ago | (#26948891)

...or there's always Greek, Hebrew, Klingon* and, hey, Chinese (that should keep us going for a while)!

Why do you think they invented Unicode?

(*Fatal Error at line 16349: statement has no honour)

Duh. (1, Informative)

Anonymous Coward | more than 5 years ago | (#26948805)

getline() and getdelim() are GNU extensions. Apple doesn't particularly like GNU. The FSF doesn't particularly like Apple (for good reasons). So, duh.
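
(The generic fallback is small, too. A minimal sketch of a portable line reader over plain C stdio via D's bindings; the function name and buffer handling are illustrative, not the actual runtime code:)

    import core.stdc.stdio  : FILE, fgets;
    import core.stdc.string : strlen;

    // Reads up to buf.length - 1 chars of one line; returns the slice
    // actually filled, or null at end of file. Unlike getline(), this
    // uses only C89 stdio, which every platform ships.
    char[] readLinePortable(FILE* f, char[] buf)
    {
        if (fgets(buf.ptr, cast(int) buf.length, f) is null)
            return null;
        return buf[0 .. strlen(buf.ptr)];
    }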

D -- wha? (4, Insightful)

ansak (80421) | more than 5 years ago | (#26948811)

I think the fact that this post has been up for almost an hour and has only 33 follow-ons shows what the software community thinks of D.

One has to acknowledge that, Back in The Day, Walter Bright did all of us a great service in producing the first PC-based C++ compiler (Zortech), which effectively forced Borland and Microsoft to take the language seriously.

Unfortunately for all of us, he seems to be better at invention than collaboration, but that doesn't devalue the contribution he made (structurally) to get us to where we are.

cheers...ank

why all the hate? (5, Insightful)

Bobtree (105901) | more than 5 years ago | (#26948841)

The griping and misinformation here is so atrocious that I'm simply embarrassed to be reading Slashdot today.

Digital Mars D is a wonderfully designed language and I'm in the process of giving up a lifetime of C++ for it.

I'm not here to defend D or enumerate its growing pains or evangelize it, but if you don't take it upon yourselves to be well informed, please don't repeat your biased gibberish to the rest of us.

Another programming thread. (0)

Anonymous Coward | more than 5 years ago | (#26949009)

It isn't C, so therefore it doesn't *really* have a place, since it is obviously more bloated and slow than the mighty C and its strcpy. Doesn't seem like great developers really post here, anyway, based upon the comments in any programming article.

It always turns into:
1. Inevitably some Javabot wanders in and makes condescending comments about how the amazing enterprisey framework they use, which uses the Adapter Bridge Factory SingletonController Reducer pattern, is superior to all; you just have to set up six XML config files in certain places and run two preprocessors on your code. But then when you do, it's just amazingly ELEGANT!

2. People chant the same tired memes over and over without really knowing what they're saying: "LOL PERL IS LINE NOISE!"

3. Usually some college kid with a semester of CS pops by and writes several pages about how sad it is that everyone is using high level languages and wasting all the processing power of the computer. Also, how they're using Linux instead of Windows (because Linux is written in C) and how it is much faster. All this without realizing that the browser they're writing it in probably has significant portions written in Javascript.

As usual for Slashdot, there is zero nuance in the discussion. Everyone assumes their experience is what applies to everyone, and their needs are what everyone else needs.

Re:why all the hate? (0)

Anonymous Coward | more than 5 years ago | (#26949125)

First impressions are profound. Worse: First impressions of features (garbage collection, for example) taint even the best implementations of those features.

People don't like to relearn what they think they already know.

No one cares about D (4, Insightful)

Kuciwalker (891651) | more than 5 years ago | (#26949007)

Seriously, name me a piece of commercial or open-source software with significant market share written in D. Library support is about 10000% more important than actual language design.

So fucking what! (1)

FlyingGuy (989135) | more than 5 years ago | (#26949151)

Fabulous, he ported a language to the Mac. OK, groovy...

The bit about having to make adjustments to the library code is news why?

Not all OSes support everything in the world; that is just the way it is. Implementing a certain function or macro one way on one platform does not in any way mean that you will be able to implement it exactly the same way on another platform.
