Ubuntu Edge Smartphone Funding Trends Low
I have serious doubts that Canonical is able to deliver on this: they do not have a history of delivering top-notch software, unless you count their press releases and boundless enthusiasm as software.
Aside from a few interesting things (upstart being among the few projects adopted outside of Ubuntu), they've basically decided to ignore whatever the rest of the community is doing and implement their own (buggy) stuff which is "better". Canonical's stuff makes GNOME 3 look usable. That takes some doing.
Aside from my doubts about their ability, I also find the concept deeply flawed. Cheap support infrastructure does not currently exist for a dockable phone. Sure, you can use it as a desktop, you just need to buy a dock that you carry around, or a dock for every desk you usually use. Sure, you can use it as a phone, you just need a bluetooth headset that you have to keep charged when you're using it as a desktop. Sure, it's dual-boot, it just means that you can't phone or use the desktop when you switch modes. Sure it can do all of the above, but you have no battery life.
People who need to navigate and use their phone a lot tend to have TWO devices: a GPS or built-in satnav and a phone. Convergence is a great idea, but you're going to pay a lot in battery life for all those features. Running out of juice is NOT FUN these days.
It appears Shuttleworth is trying to emulate companies like Apple, Microsoft and Google by doing the opposite of what used to be done in the spirit of Linux. The copyright-assignment clause on all Canonical software, Mir, forking GNOME into Unity and the doublespeak pouring out of the community spokesdrones are all in stark contrast to the early days of Debian, Slackware and open culture. Maybe he really believes he's Steve Jobs and Bill Gates reincarnated and rolled into one: I really think he's got the remorselessness of the one and the ruthlessness of the other.
I believe Ubuntu has single-handedly done more to bring down the quality of Linux on the desktop than any other distro.
I believe the reason Ubuntu is so successful is marketing, NOT technical quality. This is why I believe that the human race is getting stupider every year. Ah well.
Ask Mark Shuttleworth Anything
While I don't much like the *code* that Poettering contributes (PulseAudio took a LONG while to become stable), I see sense in a lot of his arguments:
1) System V initscripts are fine for systems that were designed 20 years ago. Things have changed quite a bit since then.
2) After seeing some of the Apple launchctl things in action, I want some on Linux!
3) If we stick to POSIX, we might as well decide to throw in the towel, break out the old Slackware 1.0 distros and grow beards. If we can design a better interface/system that's more future proof, then DO IT.
4) Letting Upstart/SysV/OpenRC and whatever compete is *not* a good thing. It's the equivalent of having 3 incomplete kernels that allow you to run your audio, graphics or disk, one at a time.
5) Turns out PulseAudio got better AFTER PEOPLE FIXED IT UP. The architecture and the idea weren't busted, but the execution was, for a long time.
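To make point 1 concrete: under System V, the equivalent of the following is a few hundred lines of copy-pasted shell per service. This is a minimal sketch of a systemd unit; the daemon name and paths are made up for illustration:

```ini
# /etc/systemd/system/mydaemon.service -- hypothetical service
[Unit]
Description=Example daemon
After=network.target

[Service]
ExecStart=/usr/sbin/mydaemon --no-fork
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Dependency ordering, respawning and enable/disable come from the init system itself instead of hand-rolled start-stop-daemon logic in every script.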
The only reason Red Hat is upstream is that they contribute so damn much code. But, as Mark Shuttleworth said, Canonical contributes users and bug reports (sometimes directly to Red Hat, hilariously). Turns out you have less control over code than its authors do, go figure.
Finally, from my point of view, Unity and GNOME 3 are both abominations that should be killed with fire. I stand 100% behind Linus's statements about compatibility and ABI breakage. The fact that your app can only run on a specific distribution with a specific set of libraries is very rarely a good way to keep guys interested in developing for your desktop.
Ask Mark Shuttleworth Anything
Strange, last I heard Debian added it as an *alternative*. Gentoo's had their own init system (they switched to OpenRC right about when I left 'em), but, to be honest, the average Gentoo user could probably boot his PC just by flicking the power randomly to clock the bits into RAM.
Ask Mark Shuttleworth Anything
Your viewpoint on how Ubuntu and Canonical contribute back to the community notwithstanding, there seems to be a stark difference between the management style of Red Hat and that of Canonical.
Unity has fed the perception that Canonical is diverging further and further from upstream: this is evident from the friction with the Debian project (which contributes the majority of Ubuntu's code), as well as with GNOME, and from the dissent over the upcoming signed-boot UEFI implementations.
Red Hat (and the Fedora project) is trying to prevent the balkanization of Linux userspace with projects like systemd, which only Ubuntu rejects.
Red Hat's business model seems to be very successful, and Canonical, despite its massive desktop market share, doesn't seem to be able to match it in reputation or revenue. Would you attribute this to Red Hat's deeper involvement in the kernel community and higher technical skill?
Adobe Released 64-bit Flash For Linux
If you look at the timeline of the amd64 architecture: http://en.wikipedia.org/wiki/X86-64#History_of_AMD64
Then it took only 8 years to produce a 64-bit port, counting from the date of the first available amd64 machine. Counting from the date the full spec was released to the public, it's about 11 years.
Now if only complex software like the Linux Kernel could be ported in shorter time....
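The arithmetic above checks out, assuming the commonly cited milestone dates (the public amd64 spec in 2000, the first Opteron shipping in 2003, and Flash 11 in late 2011 as the first final 64-bit Linux build; these dates are my assumptions, not taken from the linked page):

```python
from datetime import date

# Assumed milestone dates (my reading of the timeline, not from the article):
spec_public   = date(2000, 8, 10)   # amd64 spec released to the public
first_machine = date(2003, 4, 22)   # first Opteron ships
flash64_final = date(2011, 10, 4)   # first final 64-bit Flash for Linux

def years_between(a, b):
    """Elapsed time in (average) years between two dates."""
    return (b - a).days / 365.25

print(round(years_between(first_machine, flash64_final), 1))  # roughly 8.5
print(round(years_between(spec_public, flash64_final), 1))    # roughly 11.1
```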
Should Linux Have a Binary Kernel Driver Layer?
First off, a short comment on legality: Linus views binary drivers as legal ONLY IF they were "ported" to Linux from another OS. In his view, drivers written for the kernel from scratch are derivative works; ported ones don't derive from GPL code and (though undesired) are considered legal. Note that this "Linus" I am referring to may have no actual relation to the "Linus Torvalds" person who wrote Linux, it's just what I think I remember him saying.
Now that that's out of the way: the first curveball that hit me coming from the Win32 world was the idea of no stable ABI. First, let's get some misconceptions out of the way:
1) Windows doesn't have a stable driver ABI.
Ask NVidia. Ask Intel. Every service pack that can potentially break a driver means that the ABI has changed! How many times did you have to update a driver after a service pack? How many times did a driver "require" a new service pack to install? Yeah, I thought so.
2) Distributions have "stablish" driver ABIs.
SLES and RHEL ship reference kernels with stable ABI calls that are well documented, AND YOU HAVE THE SOURCE. It's much simpler to update drivers for Linux than for a new Windows service pack, since the Enterprise versions normally stick to the ultra-stable stuff.
But, noooooo! I hear. Screw this stable stuff, stable stuff is old! And that's the crux of it. On a desktop you don't necessarily want 99.9999% availability; you'd rather have 20 more features and put up with a non-serious hang once a month. The difference between the Windows model and the Linux model is: YOU NEVER GET WINDOWS KERNEL FEATURE UPDATES! You have to pay for them! And wait years and years... Yah, the service packs add features, but most people consider them bugfixes ;-)
So that's why the kernel shouldn't have a stable API (let alone ABI) for drivers. It prevents big architectural decisions from being taken at the opportune moment. With the kernel under git management, it's even easier to maintain out-of-tree drivers (case in point: ALSA). What developers who clamour for a stable ABI don't understand is that there _is_ one! Any fork of Linux they make can be as stable as they want it to be. If they want to benefit from continual improvement, they _have_ to make some sacrifices.
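For what it's worth, the build side of maintaining an out-of-tree driver is genuinely small these days. A minimal kbuild Makefile for a hypothetical external module (the module name is made up) looks roughly like this:

```make
# Builds mydriver.ko out of tree, against the running kernel's build tree.
obj-m := mydriver.o

KDIR ?= /lib/modules/$(shell uname -r)/build

default:
	$(MAKE) -C $(KDIR) M=$(PWD) modules

clean:
	$(MAKE) -C $(KDIR) M=$(PWD) clean
```

The pain isn't the build plumbing; it's that the in-kernel interfaces the driver calls can change between releases, which is exactly the trade-off being argued about here.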
Second-to-lastly, the guys drawing the short straw here are the distro maintainers. Again, there are many options: follow the source route like Gentoo, allow only free hardware like Debian, rely on the user base to spin driver packages like Fedora, Ubuntu and OpenSuSE, or go the stable route like SLES and RHEL. If they really want a stable video ABI so NVIDIA's drivers are even easier to install, get THEM to work together on it. If it's good, it'll get into the kernel. If not, it won't.
Lastly, try the following experiment. Buy a shiny new Windows XP Home (or Pro, for that matter) CD, and install it on your brand-spanking-new hardware. Now, try to write a USB driver using only software that you don't need to pay for, and for which you'd own the binaries after you'd written it. Finding it difficult? If you got it right, PLEASE TELL ME FREAKING HOW, SINCE I DON'T KNOW! Next step, PCI driver... but hey, Rome wasn't built in a day!