Systemd's Lennart Poettering: 'We Do Listen To Users'
Consider: RAID and similar are high-availability features. Their whole reason for being is to make sure the system stays available even if a drive fails. Systemd single-handedly defeats that purpose by refusing to even try to mount the root filesystem.
At first reading, I was misled into thinking you were talking about RAID and similar high-availability features in general, which work far better with systemd and LVM than with sysvinit, where they were a nightmare prone to breaking on any boot.
Then I understood you were talking about these features inside btrfs.
And that's where you're yet again one of those people spreading FUD about systemd out of ignorance, whether on purpose or not.
How are software RAID and features like it (such as snapshots) handled on Linux? By one of several daemons, in the LVM package, that manage the state of the virtual devices. That's because, as the systemd developers explained, even though these features are implemented inside btrfs, the kernel still doesn't expose enough state to userspace: a disk is either plugged in or not.
And this is not driven by systemd at all but by udev, which, yes, lives in the systemd repository.
The issue here is that btrfs lacks a daemon to manage the virtual devices in userspace. If your btrfs RAID is degraded, say because a disk is dead, the kernel doesn't report the array as complete; that's why udev doesn't show the btrfs RAID as plugged in, and why systemd doesn't mount it.
All of this was explained, but you chose to spread FUD by calling systemd shit and claiming it breaks userspace.
It works with sysvinit because your script doesn't care which disks are up, and will just break your setup any time a disk doesn't show up.
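For completeness, btrfs itself does offer a manual escape hatch here; a recovery sketch from a rescue shell, with the device name and mount point as placeholders:

```shell
# Let the kernel re-scan block devices for btrfs filesystem members.
btrfs device scan

# Mount the surviving member even though the array is incomplete;
# 'degraded' tells btrfs to proceed with missing devices.
mount -o degraded /dev/sda2 /mnt
```

This is what the boot process cannot do automatically without userspace knowing the array is allowed to run degraded, which is exactly the missing-daemon problem described above.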
That's really a poor showing, but the insight it gives me into the project is even worse. It tells me that in spite of the importance of redundancy (some enterprises spend gadzillions on it) and the fact that it has worked well under SysVinit for over a decade, not one person on the systemd team even considered it.
This is a lie: btrfs hasn't worked well for a decade to begin with; btrfs isn't even a decade old.
This just shows the lengths systemd haters will go to in order to spread FUD.
And don't tell me RAID and SAN worked well with sysvinit; that would be another blatant lie.
Now that it has been brought to their attention, they can't even come up with a workaround for it (see what I said above about do what I say and do it now). All I need is an unconditional 'mount -a' and apparently it can't be done. In spite of that, the various systemd boosters refuse to admit the problem even exists. I have even had a few claim it's a feature meant to protect my data.
So there it is. It's not a matter of opinion, it's a simple boolean: "Did my system boot" and the objective answer is no. There is the followup, "how then, can I make systemd boot it" and the answer is [crickets].
I described the real situation up there, and everything can be found in the ML archives, so everyone can see that everything you wrote there is a blatant lie. It's scary at this point.
Even the situation you described is wrong. You actually asked "Did my broken system boot?" and the objective answer is no. A healthy btrfs RAID would have booted, and even if a disk then failed, it would have stayed functional. But it wouldn't boot in a degraded state, just like most hardware RAID setups by default, in fact.
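The commonly suggested fstab workaround looks like the sketch below (the UUID is a placeholder); note that whether systemd actually proceeds still hinges on udev reporting the device as ready, which is the crux of the whole disagreement:

```
# /etc/fstab -- illustrative entry, not a guaranteed fix:
# 'degraded' lets btrfs mount with a missing member, 'nofail' keeps a
# missing device from failing the whole boot, and the timeout bounds
# how long systemd waits for udev to announce the device.
UUID=0123abcd-placeholder  /data  btrfs  degraded,nofail,x-systemd.device-timeout=30s  0  0
```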
Ask Slashdot: Migrating a Router From Linux To *BSD?
Now I'm starting to believe in a conspiracy against Free Software.
I find it odd to see such nonsense written here and modded up.
Because what is described here (making the product closed source) is just not possible for Free Software like systemd, but is entirely possible (and has been done before, and still is) with BSD.
Yet this explanation of why a free-software-licensed init system (systemd) is like a closed-source operating system (Windows) shows up here, based on nothing solid, and people seem to believe the nonsense nonetheless.
And there has been a large number of articles about moving from Linux to BSD over the past six months, so I'm now starting to believe this is astroturfing.
Even the move to BSD because of an init system makes no sense to me.
What I'm sure about is that to this day I'm unable to make my own Windows OS, but I'm still able to build my own Linux systems (all my Linux systems at home are custom-built from upstream sources), even today, despite having moved to systemd years ago.
That said, it's not a bad thing for an admin to go learn the other OSes out there; I more or less know most Unix and Windows systems, but not Mac OS.
SystemD Gains New Networking Features
Whether or not you like it, it's not unfair to classify systemd as being "forced" on its users.
Users of systemd include distribution builders, daemon makers and administrators. It's clearly wrong and unfair to say systemd has been forced on these people, unless they're incompetent. Actually, most of these people are happy with the change. Not all administrators, obviously, as it requires some work and learning to switch your init daemon and its accompanying startup, shutdown and live-management process.
For a start, it's wildly popular with distribution builders, but this doesn't mean jack with anyone else.
This is plain wrong; I've seen popularity and interest among daemon makers and administrators too: the very people who had to deal with sysvinit and suffered for decades from its insanity, especially for low-level stuff like network and storage management.
Secondly, for a while (though they've promised me that they're trying to fix it, and maybe have by now), GNOME had a hard dependency on systemd. Being the most popular desktop environment, it more or less forced the hand of many of the distro builders too.
Again a lie. GNOME has a hard dependency on the systemd-logind API, and it's like that because GNOME had been asking for a replacement for ConsoleKit for months, and there were two sides: those who did the work (only systemd) and those who said "You just have to do this" without doing any of the work.
The same applies to the userspace console and the Linux kernel.
To me, the whole thing seems odd. I've never seen a massive infrastructure change sweep so rapidly through the community of distributions. Especially such a major component, and double especially when things did actually work successfully before.
To me, what's odd is the time it took (15 years since I first saw the problem) for a solution to appear, and then the 5 years it took, after I discovered systemd, for it to be adopted.
Things didn't work successfully before; the people who believe they did are the ones who never had to fight their way through init scripts all these years. The massive adoption doesn't surprise me at all; it had been needed for years.
Evidence of this is that even those against systemd are afraid of, among other things, the maintenance cost of init scripts, because they don't work successfully to this day, are full of hacks, and so require lots of maintenance that "you just have to do it" talk won't cover.
SystemD Gains New Networking Features
It annoys me that someone like Poettering, who only had PulseAudio come into use because of the ability distributions had to easily change core operating system components (and wouldn't have, had the existing audio subsystem been entrenched), would then proceed to develop something specifically intended to lock down its own existence and prevent its replacement by something else. It's hypocritical.
While I totally understand why he did it -- nobody wants to put a great amount of time into something only to have it superseded -- it flies in the face of open source in general, where you contribute to an evolving 'thing', and that while your specific contribution may not exist in the future, you can be happy that you took part in the evolution of the whole, and not feel the need to stamp your face on it for perpetuity.
What flies in the face of open source is actually people like you who waste time whining about the people who actually write open source code.
Complaining is all you're able to do; just go help the Devuan project instead.
Since Devuan appeared, there are still lots of people complaining, and others dedicating their efforts to moving to BSD, and these people love to advertise that they're moving to BSD. But we don't care! Just move to BSD already and leave the people on Linux alone.
It also sets a dangerous precedent. What's going to be locked down next, in the name of stability, or speed, or whatever else (when it's really about someone trying to 'make their mark'?) Do we lock down the file system? Only one file system for Linux, full stop? Do we lock down the network transports? The window manager? The terminal? The command-line applications?
What's this hyperbolic nonsense? Nothing has been locked down here. Actually, with systemd, nothing is locked down at all; I have more power over my Linux now than ever. These doom-and-gloom stories for Linux because of systemd are just ignorant rants. People who actually use and understand systemd don't care, because these rants make no sense.
SystemD Gains New Networking Features
unless, of course, you want to start a script with a unit file; but then, are you sure that iptables is up?
In all my time using Linux, wondering if iptables had crashed has never been a problem I've had. I've had lots of problems, but never that one. Same with filesystems. Fstab has always just worked.
You have not really worked in heavy-duty Linux environments, then. I've experienced fstab failures tons of times, be it because fstab was buggy, a hard drive was dying, memory was dying, an emergency shutdown happened, ...
I agree that the usual problem with iptables is not whether it has crashed; it's rather things like the path of the executable or feature parity with the kernel.
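For the ordering worry itself, systemd can express "start my script only after the firewall unit" directly; a minimal sketch, assuming the distribution ships its rules loader as iptables.service (the unit name and script path are hypothetical):

```ini
# /etc/systemd/system/after-firewall.service (hypothetical unit)
[Unit]
Description=Script that must run after the firewall rules are loaded
Wants=iptables.service
After=iptables.service

[Service]
Type=oneshot
ExecStart=/usr/local/bin/after-firewall.sh

[Install]
WantedBy=multi-user.target
```

With Wants= plus After=, the script is ordered after the firewall unit, but the boot still proceeds if that unit is absent.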
And an extra layer in front of iptables is the last thing I need. That is a huge negative. I don't even understand why anyone would think that's a good idea.
And yet, this library (libiptc) is available in the upstream iptables package. So you're calling it a huge negative despite it being provided upstream by the iptables developers, and you don't understand it just because you don't need it. You're acknowledging that you're incompetent as a sysadmin and developer, and how unqualified you are to talk about this topic. Just like most haters who have nothing constructive to say here. I've yet to see any sane rebuttal to this, only angry people (you have the right to be angry, even about something you demonstrably don't understand), especially since systemd-networkd is optional with systemd, and this feature is optional within systemd-networkd too.
But all sane admins on systemd use it, I believe, and have thrown away their old scripts, which did the job of setting up the network, only worse.
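For comparison, the networkd replacement for such a script is a small declarative file; a minimal DHCP sketch (the file name and match pattern are arbitrary):

```ini
# /etc/systemd/network/20-wired.network
# Match any Ethernet interface whose name starts with "en"
[Match]
Name=en*

[Network]
DHCP=yes
```

Enabled with systemctl enable systemd-networkd; there is no shell logic left to maintain.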
SystemD Gains New Networking Features
Even worse, try requiring LDAP (not just making it an option when an account isn't found locally, actually requiring it) for logins on a system booting via SystemD. Have your recovery media handy, you'll have to boot from it in order to remove the LDAP requirement when SystemD can't su because the network isn't up yet (or, if the LDAP server is localhost, slapd hasn't started because, guess what, it needs to su to its configured user during its init process).
What is this nonsense? I've used LDAP since before adopting systemd in 2009, and I never had any of these problems on my self-built Linux systems.
Are you implying distributions are even worse than when I quit using them in 2000?
Are you implying that distribution maintainers are incompetent?
What I'm sure of is that it's a distribution problem, not a systemd problem. I never had the problem you describe, which doesn't even make sense, even before I used sssd, when my /etc/passwd contained only the root and messagebus accounts (now it also has the new systemd accounts).
And my systemd systems work just fine even when the network is not up yet; localhost is always there, and slapd as well as sssd listen on it.
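The usual way to keep such a box bootable when the directory is unreachable is to resolve local files before LDAP in NSS; a sketch of the relevant lines (on an sssd setup, "ldap" would be "sss" instead):

```
# /etc/nsswitch.conf -- keep 'files' first so root and other local
# accounts still resolve when the LDAP server is down or the
# network is not up yet:
passwd: files ldap
group:  files ldap
shadow: files ldap
```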
Major issue affecting Ubuntu and, as far as I know, all Debian-based systems. The workaround should be simple: allow local account logins right up until TTYs actually become available, regardless of configuration. But, apparently, LDAP isn't considered important, so this has been an issue for as long as Debian has used SystemD and will likely remain so until Debian moves on to something else.
This is BS; I can't believe Debian and Ubuntu maintainers are that bad. Where's the bug report?
The current "recommended" workaround is a pair of ifup/down scripts that require LDAP when the interface is up and make it optional when the interface is down, which is great until your system crashes or you lose power and the "optional" config doesn't get applied. Then it's time to whip out the recovery media so you can manually change the config and have a bootable system again. Needless to say, I refuse to implement that hack of a fix.
So this must actually be a bug tied to sysv compatibility, since you're talking about those broken ifup/ifdown scripts.
Granted, I moved to a full systemd configuration right away; it's not so easy for a distribution. Most of their problems seem to come from the compatibility layer.
sysvinit scripts are so insane, I can't blame them for having problems with them. Surely it pushes the maintainers to quit using these sysvinit scripts as soon as possible, even faster than before.
Cheap Games a Risk To the Industry, Says Nintendo President
As always, I see the most representative Slashdot comments being completely wrong when it comes to Nintendo.
It seems like some people wait for every piece of news about Nintendo to spew all the ire and BS that keeps them enraged inside.
I'm still wondering where people are seeing anything about "competing". Yet a lot of posts act like he talked about competing with these mobile devices.
Fils-Aime, who is NOT the CEO of Nintendo (which should give you a clue as to how misleading this news is), is talking about the content, the games, and people's perception of the value of games. It has nothing to do with graphics or animations or whatever problem people have with Nintendo; it's about content and its value to the consumer. He's saying it's dangerous for the industry to make people believe that whatever the content of a game, it has very low value (and so a very low price).
He doesn't even say it's threatening anyone right now; he says it's a possible risk.
People on Slashdot, as with the DS or the Wii (or even the first iPod), are already talking like the 3DS is dead on arrival and Nintendo is doomed (as always).
Yet it's for the opposite reason: $2 games would kill most HD console game genres faster than any Wii or DS ones would.
I think what Reggie said is more of a warning to 3rd parties. We see lots of 3rd parties porting their already (or not yet) profitable games to mobile platforms for a very cheap price, and sometimes it's exactly the same content. The problem is that a consumer who paid $40-$70 for a game, then sees the same one for $5 on his mobile, will not really be happy about ever paying the high price again. And if lots of them come to the same realization, you will see mostly western 3rd parties die left and right (worse than what we see today). Most western 3rd-party games (and lots of eastern ones too) rely on selling a lot at launch and then quickly die; if they don't sell a lot at launch, you see their prices quickly slashed. 3rd parties can do that on mobile because the cost of the game is already covered by the other platforms, but very few people (the hardcore) will want to pay so much more for the privilege of playing the game on day 1.
Nintendo doesn't have this problem, as their games sell with a veeery long tail. Basically, 3rd parties (the big ones) are shooting themselves in the foot if they keep going, and fewer 3rd parties alive to make games for Nintendo platforms means less revenue for Nintendo.
Apart from that, Reggie is not really complaining, not yet at least, as even the DS is right now more threatened by the 3DS than by any mobile device with games.
They're not in direct competition, just as computer games (like Flash games and everything that has existed for a long time) don't compete directly with console games (except for dev resources). Or that would mean consoles are winning...
Fedora 15 Changes Network Device Naming Scheme
I agree partly with you.
It's true that consistent interface naming has just worked for years with multiple interfaces.
The author talks about races, but they are not explained in his paper and have been marked as such since 2009.
What this basically boils down to is that distributions can now streamline the naming of your interfaces without you having to test them at first install to know which is which; you'll know immediately from your OS.
This shouldn't upset anyone, except people who can't accept change. It will just be a matter of getting used to lom0-X or em0-X instead of eth0-X.
People like me, who already use udev rules to rename their network interfaces, won't be affected anyway, and you can still name your interfaces however you want, since udev is still what renames them.
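For illustration, this is the kind of udev rule I mean; the MAC address, file name and chosen interface name are placeholders:

```
# /etc/udev/rules.d/70-my-net-names.rules (hypothetical file)
# Rename the NIC with this MAC address to "lan0", whatever kernel
# name it would otherwise have been given:
SUBSYSTEM=="net", ACTION=="add", ATTR{address}=="aa:bb:cc:dd:ee:ff", NAME="lan0"
```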
Fedora 15 Changes Network Device Naming Scheme
As has been mentioned above - you can tell from your comments you are just a home user.
Try writing OS-level software (stuff that is imaged onto the device) that depends on the NIC in position 1 on the PCI bus always being the management interface (i.e., the first port in the chassis). Remember, this software has to know IN ADVANCE which port it is; you can't use the MAC for this at all.
This has been a problem in Linux for years and one that developers have always had to hack and slash around. I am glad RedHat is finally fixing it. Hopefully other distributions will follow suit.
I don't understand what you're saying. If you wanted to do that, you could have done it for years with udev, IN ADVANCE of the software you want to launch.
For YEARS udev has provided persistent device naming, so what is said in the article is wrong: you will always get the same name for your devices.
For years my Ethernet interfaces have been named something other than eth0-X, based on whatever arbitrary value you want that udev provides, including the PCI slot and whatnot.
So what exactly are you talking about?
The sole difference I see is that the distro will now do it upstream instead of you having to write your own udev rules.
Which seems to be a good thing, since people like you, who are not mere "home users", apparently don't know about these features and have missed this functionality for years. What's worse is that you call it a problem with "Linux" when the sole problem is you being ignorant of the solution.
It's strange, too, that you talk about this being a problem with Linux and about devs having to hack and slash, when the solution couldn't be done anywhere before because the specs weren't out (SMBIOS 2.6, exposed through the Linux kernel since 2.6.36 a few weeks ago, or ACPI, or via biosdevname), and the solution proposed here is still the exact same kind of hack you're talking about.
FWIW, the sole difference here is the naming convention; the solution is still implemented via udev rules, with the exact same mechanisms as before, mechanisms used for YEARS with the same hacks.
The main difference compared to before is that now the naming scheme can be streamlined by a distribution, whereas before it couldn't, because it would have had no consistency.
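And for the "NIC in PCI position 1" requirement above, the same mechanism can key on the bus position instead of the MAC, so the name follows the slot rather than the card; the PCI address and name are placeholders:

```
# Match the device by its PCI path; any card plugged into this slot
# will always come up as "mgmt0", known in advance of any software:
SUBSYSTEM=="net", ACTION=="add", KERNELS=="0000:02:00.0", NAME="mgmt0"
```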
Nintendo 3DS Launching On March 27 For $250
What's not in the summary or TFA is that this is the first handheld to be "fully" region locked. The PSP was region locked for movies, while the DSi had region locking for the online stores.
Actually, DSi games were region locked too.
But this is the first handheld where titles bought off store shelves will all be region locked. There's been evidence for some time that Nintendo are the most anti-consumer of the three console developers, but I think this is probably the final proof.
What? This has nothing to do with being anti-consumer.
But it's very bad for the small number of consumers like me, who import games and play genres that are not popular in their territory. I can't say I'm pleased with this.
I'm used to it, but I thought Nintendo understood that it wasn't doing them any good.
I hope this protection will be broken.
But given my long experience with region-locked consoles, people like me are far too small a group for region-locking to have any impact on sales, in entertainment at least. It's even more obvious with Blu-ray or DVD.
Combined with the console's price-point, this really does make me wonder where Nintendo are going with this. They've put it at a price tag which, like the PSP, is going to put it out of reach of most of the playground demographic, at least until Christmas.
The launch units are obviously for enthusiasts.
And yet among non-Japanese grown-up gamers, one of the biggest uses of handhelds is for when you go travelling. I'm not going to sit at home and play on a handheld, in general, when I have proper consoles and a gaming PC in my flat. Why should I peer at a tiny screen and cramp my hands up for a handheld's controls when I could be gaming in comfort? And my commute? I suspect that like many people who live in or near a major city, my commute on public transport is just too crowded and too rattly for handheld gaming.
Clearly you're part of a tiny fraction of the people who bought a DS/DSi/DSi XL:
- lots of non-Japanese grown-up gamers bought the DS,
- the DSi XL is made precisely for people who sit at home and play on their handheld,
- most children are not playing their DS only when they commute,
- people who commute in Japan play the DS even when it's crowded.
So basically you're not even the target for the DS from what you describe, which is why you can't give a proper prediction of the success or failure of the 3DS.
Final Fantasy XIII-2 Announced
You're misreading what I said.
Like it or not, FF13 was starved of resources by Square-Enix. But as any project manager will tell you, there is more than one kind of resource. FF13 had plenty of budget. It had no shortage of artistic talent. But it was deprived of the company's core games development talent and of any sensible kind of project management. Go read the interviews that followed FF13's launch, when Square-Enix realised it had a turkey on its hands and began the blame game (which we've seen even more pronounced on FF14). The game had a huge number of artists working for many years to produce assets, artists who just aren't needed for the low-budget, graphically primitive handheld and Wii games. What it didn't have was anybody putting work into developing game mechanics, or even a storyline to hold the game together. This is why we got a game that was graphically beautiful (on the PS3, at least), but which just did not work as a game.
I agree with that, but it still has nothing to do with DS games depriving HD games of resources.
That claim is nonsense and just plain false, as the HD engine for FFXIII and other SE games was being developed at the same time. DS game developers at Square Enix didn't deprive HD game developers, that doesn't make sense, and they certainly didn't deprive DQ IX of resources.
Meanwhile, the people who knew how to design games were off doing stuff like 358/2 Days on the DS. Now sure, those games have some pretty neat gameplay elements, but they are always going to be constrained by the limitations of the hardware. It's not just graphics; a lack of RAM in these systems constrains the size of the play areas you can use and so on (hence the mission-based structure that a lot of these games tend to take).
So those can't be the games that deprived FFXIII of game developers.
The results of Square-Enix's strategy have been plain in the performance of their games lately and their financial results for the last year or so (for which see google). The handheld and Wii games get ok-ish reviews and do not exactly set the charts on fire in terms of sales (they tend to do ok-ish in Japan and underwhelmingly in the West); they don't cost much to develop, but they're not exactly setting the world on fire.
I agree with that; except for DQ IX, they are crap games. It's the same for PSP games. Actually, you could say the same for most SE games released this generation, HD games included, except for DQ IX on the DS, so I don't see the point here.
And no, no SE Wii game received OK-ish reviews.
At the same time, the big-budget main-series FF games take forever to develop (remember, no effective project management) and get panned on release. If I remember, FF13 had pretty decent initial sales, but these fell off a cliff as word of mouth basically torpedoed the game below the waterline.
Receiving a 39/40 review from Famitsu is being panned on release? What nonsense is that?
FF13 was just far more front-loaded than any other main numbered FF release in Japan, that's all. Its first week amounted to 85+% of its LTD sales, which is just insane.
In short, Square-Enix does need to put its resource focus back onto its big-budget AAA titles; but by resource, I mean development talent, not money.
What I can't accept is your implication that they put talent into their DS and Wii games, which I hope, and believe, is completely false. If it were true, given how bad their games on these platforms are (except for Yuji Horii's Dragon Quest, of course), Square Enix would never be able to pull out of the rut they're in nowadays.
As for Japanese gaming falling behind the West; wake up and smell the coffee. It's clear you're a Nintendo fanboy - and one of the minority who hasn't been through the disillusionment process yet. Don't worry, it's not necessarily a permanent condition; I was a Square-Enix fanboy until the last couple of years cured me.
It's clear lots of people in the West have blinders on concerning this subject, it's clear you do too, and that's why what I'm saying is not popular here on Slashdot (despite it being fact), which is very hardcore-focused.
You know what, I feel exactly the same as I did in the year before the Wii launch, and the year after, on Slashdot. People were as blinded then as they are now. But I'll read your argument as to why I'm asleep; surely I must have missed some things.
As a games developer, Nintendo have fallen comprehensively behind the West (and have now realised this and are trying to catch up; witness Metroid: Other M, though I wouldn't categorise that game as a success). They've fallen into another common Japanese gaming trap; failing to identify which elements of their old titles to preserve and which to discard. Hence we still get the antiquated lives system in Mario Galaxy 2, and hence we still get the same damned plot over and over in Zelda. You may like it, but the rest of the world is moving on.
For now you've only shown that you had a knee-jerk reaction to me saying Nintendo puts western games to shame.
I don't own any Mario Galaxy games (but I'm supposed to be a Nintendo fanboy, right?), as they have no appeal to me, so I'd be hard-pressed to comment on them. I agree that Nintendo made a huge mistake with Metroid: Other M. Yet just taking a real success (and not your straw-man successes) like NSMB Wii, which I actually bought, shows you're partly wrong.
Nintendo's market these days is nostalgic 40-year-old neckbeards who don't really like games, and new entrants to gaming. I suspect they're not getting much in the way of repeat custom.
What does that mean?
Still, as I say, Other M (which does try to adapt elements from Western gaming in a fairly major way) is a first sign that they have, belatedly, recognised this and are trying to adapt. Sure, Other M isn't great in itself, but it's a sign that there's hope for them.
I don't know why you insist on talking about Metroid: Other M, which is far from a success for this company. Why don't you talk about Donkey Kong Country Returns instead?
There's hope for Nintendo, you say? Reading you, I would never guess Nintendo is the company that defined this gaming generation, both in hardware and software.
Still, it's unfair to harp on Nintendo. Other Japanese studios have been just as guilty of failing to adapt to the current generation; even those who had some early successes. Look at Sega; they put out the sublimely good Valkyria Chronicles, which was one of the absolute stand-out games of the current console hardware generation, which married artistry and technical prowess perfectly and which managed (almost uniquely for this console generation) to match the quality of what the likes of Bioware and Bethesda have been doing.
And then look at how they blew it. Sure, the original game is still out there and, in fairness, it is as good as ever. But they've now shifted the series squarely onto the PSP, where it is hampered by the poor specs of the handheld, forcing it to reduce the scope and scale of its battles, and to replace the beautiful cutscenes of the original (which complemented the gameplay perfectly, rather than overwhelming it) with a bunch of still images.
Did it occur to you that the sales were so poor that they couldn't sustain such a game on HD consoles?
Valkyria Chronicles wasn't even localized for the European countries, which should give you an indication as to why they switched to the PSP. I still think that was a mistake; they could have released it on the Wii, which has far better exposure in the West than the PSP, which was already dying there.
Capcom, one of the only studios holding its own this generation (not posting loss after loss), put Monster Hunter 3 on the Wii precisely because they saw they couldn't sustain the development of such a game on HD consoles. SE stubbornly went all out, and now they're in trouble. I agree that most studios in Japan just weren't ready for HD content. They needed to get there more slowly, and now they're all in big trouble.
It's not that I'm biased against Japanese gaming. Really, in the last console generation, if you took my top 5 console games, there'd only be one Western game in there (Knights of the Old Republic). It's just that the Japanese gaming industry is going through a really sick patch right now and needs some urgent interventions to turn it around if it doesn't want to become an irrelevance on the international scene.
But what I'm telling you is that it's exactly the same in the West.
Bizarre Creations (under Activision, the only western publisher doing well this generation) is being shut down as we speak. So you think everything is better over there?
Western studios are closing left and right.
You're saying it's better to leave everyone behind in tech and graphics and then close up shop (like lots of eastern and western developers have).
I'm saying it's better to sell lots of games that people actually want to play and find fun, and to be hugely profitable and carry on (like Nintendo).
Activision/Blizzard is the only one being successful while catering to your demands for tech and graphics, and I think that's because this model is just plain unsustainable for more than one player. All the big western publishers started going the Activision way last year. I think they're precipitating their own deaths, but we'll see.
THQ nearly died with this strategy; they've come back with uDraw, which is their biggest success lately.
Tech and graphics are not what drives the entertainment business, and Nintendo is the living proof of that, with their never-seen-before profits in the game console business despite having an SD console and a handheld.
Final Fantasy XIII-2 Announced
The high end titles have suffered (13 and 14) because there has clearly been a lack of development focus on them. It's clear that Squenix's emphasis has been on bad-to-middling handheld titles, like the (entirely pointless) Dissidia games, the Kingdom Hearts handheld titles and rubbish like Crystal Chronicles on the Wii. The company was doing just fine right through to FF12 (which was difficult to get into, but pretty awesome when you did).
Then you must have an agenda, because the problems of FFXIII have absolutely nothing to do with technology. They have to do with game direction, which is completely independent of which console you develop for, or of whether you master every bit of the console.
Basically, you're coming into this thread talking about problems that aren't there (at least for FFXIII).
The lack of towns in FFXIII has nothing to do with SE having Wii, DS or PSP devkits, or with the games they made for those platforms.
And the Enix part of the company is doing just fine on the DS, with Dragon Quest IX posting (for now) the best third-party sales ever on a game platform in Japan (more than 4 million copies sold in Japan alone).
Contrast that with the fact that FFXIII is the mainline numbered FF (excluding the MMOs, i.e. among the traditional FFs) with the fewest sales in Japan (fewer than 2 million sold, per the last Famitsu top 100 released recently).
If SE followed your advice, they would be dead by now. FFXIII was in development for a loooooong time and cost a LOT of money.
It really only is with the advent of the current hardware generation that their output has gone to hell.
It's symptomatic of wider Japanese gaming, I think. Outside of a few exceptions, Japanese developers have never really got to grips with the PS3, 360 and the modern PC in a way that the West has. As a result, I think Japanese console games now lag behind their Western counterparts to roughly the same extent that they led them by in the PS2/Xbox/Gamecube generation.
So the last sentence was your hidden agenda.
Your last sentence is wrong BTW. Nintendo alone proves you wrong on all counts.
Japanese console games are far beyond their western counterparts just counting Nintendo alone.
The main problem of this generation is money, greed and graphics.
This generation, western console games look more advanced (I didn't say better) because it's more of a Western habit to put lots of money on the table to make grandiose games; Japanese developers are far more conservative. The problem is that there's no market to sustain these games (except on Wii and DS), and the consequence is that dev studios are dying left and right, and those that aren't dead yet are posting loss after loss every quarter. Lots of big publishers died or are dying this gen.
The only reason Japanese console games seem to lag behind is that they saw the obvious outcome a little sooner: their death if their games don't sell.
And it's symptomatic of most publishers (both western and eastern) this gen: not supporting the market leader with grandiose products. The writing has been on the wall since 2007, really.
For Square Enix and Final Fantasy, I guessed the outcome in 2007, seeing how they were handling FF and DQ: FF got all the push from SE, leaving DQ behind, but I was sure Dragon Quest was the one that would come out in front and survive, if only because Yuji Horii was making the right choices (like putting his next DQ on the leading platform as always, which was the DS), while the Square part of the company was doing nonsense like putting FF on the losing consoles just for "the graphics". That showed right away that FF was going in the wrong direction. The MMO FFXIV only confirmed the fiasco in spectacular fashion.
It's sad, really, when Xenoblade is a better FF than FFXIII but doesn't have the brand name to sell as much, not even a tenth of what FFXIII sold. FFXIII is still a financial success, I think, though not as big as SE hoped, as they're already in trouble despite FFXIII and DQ IX last year.
The 5-Year Console Cycle Is Dead
The Wii has only one processor core. The Wii has a GPU capable of only ~15 million polygons/second max, and incapable of plain old bumpmapping, nevermind more complex shaders. It has a pitiful amount of memory available. Reducing the resolution of a 360 or PS3 game doesn't reduce the massive amount of shaders and effects the Wii simply could not handle. That's why games need to be completely independently developed for the Wii, it's nearly impossible to do a straight port and downgrade, simply because the limitations are so vastly different. It's a Gamecube. Surely you're not suggesting that a PS2 could play PS3 games easily at 480p as well?
Which just shows you're ignorant. The PS2 DOES play "PS3" games at 480p, like some football games (PES/Winning Eleven), some beat 'em up games, and others, even some movie franchise games.
So yes, you're wrong.
There is no technical reason why the Wii can't have the same games as the HD consoles, except the games wouldn't be in HD of course.
Your technical nonsense is just an excuse, which was debunked a while ago.
The 5-Year Console Cycle Is Dead
Nintendo seems to be the only one that needs to upgrade the capabilities of their current console. There's lots of games coming out for PS3 or XBox360 that I'd like to play, but these games are not coming out on the Wii because it's simply not powerful enough.
Nintendo doesn't need to upgrade anything because they're not in the technology business. And the reason these games are not coming out on the Wii is not that it isn't powerful enough. It's just an excuse, as shown when a well-known game engine (enabling these games) wasn't released for the Wii for this very reason, yet later came out for the iPhone, which has an even lower resolution. How stupid is that?
This excuse doesn't hold up anyway: just look at the Gamecube, which, despite being more powerful, didn't get the games either.
The 5-Year Console Cycle Is Dead
I think the larger danger to the consoles is not the PC market, but the mobile market with the iPad and such. I've been surprised at how much the iPad can actually pull off for not being just a gaming device (N.O.V.A., etc).
This article reminds me a bit of some of the early predictions where the people couldn't see the need for more than a few computers in the world. It reeks of something that will come around and bite them in the ass for not progressing quick enough.
I've heard this countless times, and I don't believe one bit of it.
Like when the Wii launched and I saw Slashdot collectively get everything about it wrong, like the name "Wii"; that thread was very virulent and stupid.
All of this comes from the same mistake, every single time: people believe that consoles are in the technology business, while consoles are in the entertainment business. I understand why people make this mistake, especially on Slashdot, but still.
The iPad, and before it the PC, would be a threat to home consoles if consoles were in the technology business.
But now put them in the entertainment business. To help you, assume consoles (games for consoles, actually; talking about consoles is wrong already) are movies. Now look at these movies, and weigh the amount of fun between watching them in your living room, alone in your bedroom on your computer screen, and on the move. And perhaps you'll understand why the iPad or the PC is not a threat to dedicated home consoles. At least I'm convinced they're not.
But people rightfully feel consoles face competition from these devices, because the 360 and PS3 look more and more like PCs, opening themselves up to competition from other PC-derived devices like the iPad.
The 5-Year Console Cycle Is Dead
The Wii just appealed to the casual gamer grandmas who would have never considered console before. The only reason it sold so much is because it opened a new market that consoles could previously never break into. It was also relatively cheap, further lowering the bar to its entry into the market. The 360 appealed more to the traditional console crowd. Most serious gamers I know have 360s. Not many have a Wii.
This is plain wrong on two counts:
- The 360 mainly got PC games, so it doesn't appeal to the traditional console crowd at all, but rather to the traditional PC crowd, who must have migrated heavily. This just accelerated the demise of PC gaming. Only the true hardcore are still on PC now; the "mainstream" PC gamers (so mainly US gamers) seem to have moved en masse to the 360.
- "Serious gamers" doesn't mean anything, and if you meant hardcore gamers, they obviously play all the best games on all consoles, so only the ones who have a Wii are true "serious gamers". The others are just fanboys or kids if they claim to be "serious gamers".
I also don't know why you're referring to the Wii in the past tense.
Alternative To the 200-Line Linux Kernel Patch
Best for you right now? Update your .bashrc file.
Best for all the people who miss this little nugget? Include it in the kernel.
This is nonsense.
Best for all people who miss this little nugget? Include it in the Linux distributions.
This is what they are for BTW.
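For context, the .bashrc approach being discussed was a cgroup trick: each interactive shell session puts itself into its own CPU cgroup, which is roughly what the kernel patch does automatically per TTY. A minimal sketch, assuming a cgroup v1 hierarchy mounted at /sys/fs/cgroup/cpu (mount points and setup vary by distribution; this approximates the snippet that circulated at the time, not a drop-in solution):

```shell
# One-time setup (e.g. from rc.local, as root), assuming cgroup v1:
# mount the cpu controller and create a world-writable parent group.
mount -t cgroup -o cpu cgroup /sys/fs/cgroup/cpu
mkdir -m 0777 /sys/fs/cgroup/cpu/user

# In each user's ~/.bashrc: for interactive shells only, create a
# cgroup named after the shell's PID and move the shell (and thus all
# its children) into it, so the scheduler balances between sessions.
if [ -n "$PS1" ]; then
    mkdir -m 0700 /sys/fs/cgroup/cpu/user/$$
    echo $$ > /sys/fs/cgroup/cpu/user/$$/tasks
fi
```

The effect is that a heavy compile in one terminal lands in a different scheduling group than the desktop session, which is the same isolation the 200-line patch provides per TTY.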
Alternative To the 200-Line Linux Kernel Patch
Umm no. It will be in the default kernel eventually and that works out-of-the-box. The idea that user friendliness is "pasting some lines to bashrc and running some commands" and that "user friendliness" should be left up to distros rather than the main line for Linux is pretty much one of the reasons Linux has never really mattered on the desktop and why 95% of computer users prefer Windows or Macs.
This is pretty irrelevant here because the people affected by this patch and benefiting from it are all deeply involved in Linux already.
What you say is even more irrelevant because this patch won't in any way affect the users who prefer Windows or Mac (users used to a GUI, not a command line) if they ever happen to try Linux.
To add to the ridiculousness of what you're saying, the kinds of workload causing this scheduling problem (heavy stress on the CPU scheduler) are simply unsustainable on both Windows and Mac.
The problem here is clear to me: this is a userspace problem that they're putting in the kernel. It's a userspace problem because one addition of this kind isn't enough to be effective; you need one for every workload you happen to hit. Which means it must be configurable, and if you put that in the kernel, you then have to put the myriad other workload corrections in the kernel too, or you'll have inconsistencies, with some workloads handled in the kernel and others in userspace.
This is a policy problem basically, and I just don't understand how the kernel developers can say it's sane to put this in the kernel.
Of course one patch of this kind is not bloat, but as soon as you start adding the corrections for other workloads, it'll add huge bloat to the kernel. This doesn't make sense to me.
Windows Cluster Hits a Petaflop, But Linux Retains Top-5 Spot
That's a bit excessive! It does have some nice things going for it, including a fairly nice API that's been binary and source compatible for decades.
This is no advantage in this case, as there are decades-old APIs which are nicer, standard, and, as a bonus, have open implementations (like MPI).
There's end-to-end Unicode support in all APIs, a nice event logging and tracing system, a nice performance monitoring system (WMI), various asynchronous file and socket APIs, including advanced copy-less APIs that can tie TCP streams to specific CPU cores, etc...
All of which are pretty much useless in HPC, and when they are useful, they're already there in better form on Linux. A nice event logging and tracing system, really? When people want maximum computing efficiency, they don't want all that.
Unlike Linux, Windows has a built-in volume snapshot system that supports application quiescing (not just cache flushing), exportable snapshots, advanced access-list support that is standard and consistent, etc...
All of which are useless in a cluster. Windows snapshots would only be a good thing if they were exportable to Linux LVM; the same goes for Windows ACL support, if it were compatible with the POSIX ACLs, or at least the Linux ones.
I didn't know that, but at the same time who cares?
Really, the biggest issue with Windows is that the source is closed, so if you need something special for a cluster, you're out of luck. "patch tuesday" is only an issue on networks which are not controlled, and a supercomputer would use a dedicated, isolated network.
Which is nonsense, because a FS like Lustre is still updated constantly and often needs tweaking for particular hardware.
And no, the biggest issue with Windows is its efficiency. On the Tsubame 2.0 system it's among the worst, with an Rmax at 52% of Rpeak. That's partly due to the use of GPUs, but still: even the number one on the Top500, which also uses GPUs, is at 54%. Linux efficiency can go as high as 87% and more.
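For reference, those efficiency figures are simply Rmax (achieved Linpack throughput) divided by Rpeak (theoretical peak). A minimal sketch of the arithmetic, using illustrative round numbers rather than exact Top500 entries:

```shell
# HPL efficiency: achieved Rmax as a percentage of theoretical Rpeak.
efficiency() {
    awk -v rmax="$1" -v rpeak="$2" 'BEGIN { printf "%.0f\n", 100 * rmax / rpeak }'
}

efficiency 1192 2288   # a GPU-heavy system: prints 52
efficiency 1500 1725   # a well-tuned CPU-only system: prints 87
```

GPU systems of that era posted low percentages because Rpeak counts the GPUs' full theoretical throughput, which Linpack could only partially exploit.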
Mega Man Designer Explains Japan's Waning Video Game Influence
Japan, seriously, how many times do we need the protagonist to be a 13 year old boy with no fashion sense and spiky hair? Also, would it kill you to have the story make some goddamn sense for once?
Perhaps you should play more Japanese RPGs, instead of always playing the same one.
At least that's what I get from what you're saying.
Seriously, when I find out that the main character is the dream of a ghost and the answer all along was that we needed to combine all of the feelings of love throughout the world to break the time loop or something I just want to kick the writer in the nuts.
Oh god! I hope you don't play games only for the story, though, or you missed a lot of this game, including the essential part of the story actually; you only remembered the ending! The whole game must have been a bore for you. Remember: games are about having fun, not watching a story, which is what movies are for.
That's why I tend to prefer western RPGs, even if they do spend way too much time stealing ideas wholesale from Tolkien, again. I'd love to see more studios go the Mass Effect or even Alpha Protocol route just to freshen up the genre.
Good for you. I'm actually more and more disappointed by classic JRPGs, mostly because they've become more and more like movies, which is not fun in a game (lots of cutscenes, etc.).
But SRPGs (a mostly Japanese RPG subgenre), on the other hand, are getting better, and that's a good thing for me, as that's my preferred genre.