
PC Games Go To Boot Camp

Zonk posted about 8 years ago | from the suck-in-that-gut-civilization dept.


1up has taken several of the more popular recent PC titles to Apple Boot Camp and reports back on how they handle the MacBook Pro hardware. From the article: "With all settings on medium, F.E.A.R. is absolutely playable. Again, none of the silky-smooth 60 fps that hardware freaks clamor for, but it looks good and plays well even with tons of characters onscreen. Annoyingly, F.E.A.R. offers a really pitiful selection of resolutions, all of which are constrained to the old-fashioned 4:3 aspect ratio -- meaning that play on the MacBook's widescreen is stretched, and kind of ugly. That's not a hardware issue so much as limited programming, and presumably anyone with a widescreen PC is in the same pickle."

90 comments

Awww... (1)

Avillia (871800) | about 8 years ago | (#15100173)

And I was hoping for Video Game Characters going to bootcamp with hilarious and sexual results.

Alas.

Awww...All's fair in War, and War. (0)

Anonymous Coward | about 8 years ago | (#15100869)

Well, the headline made me think of America's Army (the game). They're supposed to be using the Unreal 3.0 engine.

Hmm (2, Insightful)

NIK282000 (737852) | about 8 years ago | (#15100197)

Nice article, but I don't know why anyone would want to game on a laptop. With the screen and keyboard so close together, that's a back problem waiting to happen. I would like to see how the Mac desktops size up against, say, a Dell or HP desktop.

Re:Hmm (2, Interesting)

allenw (33234) | about 8 years ago | (#15100267)

That's an easy one: travel.

It's great being able to pop open a laptop in the airport, on the plane, etc., and have a nice relaxing game of whatever. Especially when you are stuck in some hick town with no social scene at all. If I have to take my laptop anyway, I might as well get some use out of it other than doing a presentation or whatever.

[My biggest complaint is the games that require the CD/DVD to be present when they don't actually pull anything off of the media, or only require it for the audio track that I turned off anyway. Sure, there are lots of tools to get around this, but it is still annoying to have to do those extra steps.]

Re:Hmm (2, Informative)

99BottlesOfBeerInMyF (813746) | about 8 years ago | (#15100347)

Nice article but I dont know why any one would want to game on a laptop.

LAN party. You know, a dozen guys and gals go to someone's house. We usually have about three desktops and about nine laptops for a typical night. Who wants to lug a desktop and a monitor over to a friend's house? Just buy a USB keyboard (maybe a gaming keyboard), plug it into your laptop and go.

Re:Hmm (1)

Robotech_Master (14247) | about 8 years ago | (#15100402)

Plus, sometimes you just wanna get out of the house. Go down to the Internet coffeeshop and game there, or to the LAN-gaming place but use your own computer that has all your custom macros on it--most LAN-gaming places won't let you put that stuff on their computer.

And there are also those folks who can't afford or don't have access to high-speed Internet in their area, so taking it on the road is the only way they can do it via high-speed at all.

Sucky Resolution Support (1)

NekoXP (67564) | about 8 years ago | (#15100215)

I find it sickening that modern games do not support what should be standard screen resolutions.

All console games these days have widescreen support. It is not hard to do.

In this HDTV age, why don't games support the standard HDTV resolutions, too? 720x480, 720x576, 1280x720, 1920x1080 - it's not hard is it? How hard is it to populate an array with some other options?
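
The change being asked for really is small. A rough Python sketch (a made-up table, not any shipping game's code) of what adding widescreen and HDTV entries to a hard-coded resolution list amounts to:

    # Hypothetical resolution table: the usual 4:3 presets plus the
    # widescreen/HDTV modes mentioned above.
    RESOLUTIONS = [
        (640, 480), (800, 600), (1024, 768), (1280, 960), (1600, 1200),  # 4:3
        (720, 480), (720, 576),                                          # SD/ED TV modes
        (1280, 720), (1920, 1080),                                       # 16:9 HD
        (1280, 800), (1440, 900), (1680, 1050), (1920, 1200),            # 16:10 panels
    ]

    for w, h in RESOLUTIONS:
        print(f"{w}x{h} ({w / h:.2f}:1)")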

Half-Life 2 supports it (0)

Anonymous Coward | about 8 years ago | (#15100319)

I was pleasantly surprised to find out Half Life 2 had both 16x9 and 16x10 modes when I got a Dell 24" monitor. Sadly, my old video card isn't powerful enough to run it at full resolution.

Re:Half-Life 2 supports it (1)

NekoXP (67564) | about 8 years ago | (#15100962)

Yeah, I have HL2. It's great apart from the chat font in the Steam UI being way, way too tiny to read from 6 feet away on a CRT HDTV, and having no way to change it except by hacking resource files.

Re:Sucky Resolution Support (4, Insightful)

Onan (25162) | about 8 years ago | (#15100531)


Well, even beyond that, why would you possibly use a hard-coded list of specific resolutions, however long?

As soon as you support more than one resolution, you (or your libraries) already need to handle scaling and talking about your polygons in portion-of-display units rather than number-of-pixels units. That work is already done, so why limit yourself to any number of specific resolutions, rather than just scaling to whatever pixel count and aspect ratio the display happens to have?

Do you really think that you can predict now the specs of every display that any person is ever going to use to run your game at any time in the future? This is nearly as absurd as people who chain their website design to absolute numbers of pixels.
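
To make the scaling argument concrete, here is a rough Python sketch (plain arithmetic, not any particular engine's API) of deriving a symmetric perspective frustum from whatever width and height the display actually reports, so no resolution list is needed at all:

    import math

    def frustum_extents(vfov_deg, width, height, near):
        """Half-width and half-height of the near plane for a fixed
        vertical FOV and whatever pixel dimensions the display has."""
        aspect = width / height
        half_h = near * math.tan(math.radians(vfov_deg) / 2.0)
        half_w = half_h * aspect
        return half_w, half_h

    # Same vertical FOV, two different displays: the frustum simply widens.
    print(frustum_extents(60, 1024, 768, 0.1))  # 4:3 desktop panel
    print(frustum_extents(60, 1440, 900, 0.1))  # 16:10 laptop panel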

Re:Sucky Resolution Support (1)

Chris Pimlott (16212) | about 8 years ago | (#15100555)

Some do; most id Software engine games (like Doom 3), as well as Half-Life 2, let you manually set any resolution and aspect ratio you wish. The only catch is you have to do it from the console (or in the config files directly). The menus still have a limited number of pre-set resolutions.

Re:Sucky Resolution Support (2, Interesting)

Babbster (107076) | about 8 years ago | (#15100747)

A friend and I were talking about this very issue recently. While I tend to agree that PC games should be entirely flexible in terms of resolution (since there are far too many display options and aspect ratios available), I realized that there was one factor which could be important to a game developer: Preserving the cinematic intent of the game. For example, if a game is supposed to surprise you by attacking from behind, it can't really have a third-person viewpoint available. The same could be true in a 4:3 versus 16:9/16:10 situation in that the level/game designer might want to constrain the viewpoint to 4:3 in order to cause a sense of claustrophobia while enemies are off to the sides just out of vision.

In a similar vein, I could see where some people operating with their 17" 4:3 screen in a multiplayer "twitch" environment could be upset that their opponent is getting a much wider view on their 23" 16:10 screen. Then again, that kind of issue has been in play for a while now with the potentially large disparity between video cards/monitors and their available resolutions (i.e., someone at 1600x1200 on a 21" screen is going to be able to see better than someone at 800x600 on a 15" screen.).

Of course, if the only reason a developer puts limitations on resolution is because of bad programming, then that's no kind of excuse. :)

Re:Sucky Resolution Support (2, Insightful)

n8_f (85799) | about 8 years ago | (#15101193)

I realized that there was one factor which could be important to a game developer: Preserving the cinematic intent of the game.

If that were the case, then they would leave the resolution set to what it is (preferably native, but that is the user's choice) and just use a 4:3 chunk in the middle. Instead, they change resolution to their 4:3, non-native one and leave the screen looking like crap. If they cared about the quality of the experience, they've just ruined it far more than allowing a widescreen view would have. There have been widescreen monitors now for over half a decade. At this point, it is just lazy programming.

Re:Sucky Resolution Support (1)

NekoXP (67564) | about 8 years ago | (#15100998)

Having been involved in game design projects before, yes, I do think it's possible. Let's put it this way: there are three major screen ratios in the world: 4:3 (TV, CRT), 16:9 (the new widescreen standard) and 16:10 (a computer-only bastardisation to keep LCD costs lower or something).

In most games you just render off to the side a little more. You space out your HUD. Since the viewport in 3D games is set out by 2 or 3 procedural functions, this is very very very trivial coding.

Why use a fixed list of resolutions? That is down to usability and user friendliness. If you listed every resolution that a graphics card claimed to support, or that a monitor with broken DDC reported, or some variation on a theme, there would be a 100-entry list just for the resolutions. (A great example: go get Black and White 2, which asks DirectDraw what resolutions it can support. If your DDC monitor works, it shows all the resolutions it DOES support. If you are using component output on an HDTV or similar, you are presented with a 62-item list of very esoteric and probably unsupported display modes.)

Most games present a preset or filtered list with the most common screen modes in, so that you can pick out the one you want.
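
One way that filtering might be done (a hypothetical Python sketch, not how any particular title actually does it) is to intersect whatever the driver enumerates with a short whitelist of common modes:

    # Whitelist of common modes to show in the menu.
    COMMON = {(800, 600), (1024, 768), (1280, 960), (1280, 720),
              (1280, 800), (1440, 900), (1680, 1050), (1920, 1080)}

    def menu_modes(enumerated):
        """Keep only well-known modes, sorted by pixel count."""
        return sorted(set(enumerated) & COMMON, key=lambda m: m[0] * m[1])

    # Example driver-reported list, including esoteric modes to discard.
    reported = [(640, 400), (720, 405), (1024, 768), (1440, 900), (1856, 1392)]
    print(menu_modes(reported))  # -> [(1024, 768), (1440, 900)]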

Since I know Half-Life 2, Doom 3 (after a bunch of patches) and Quake 4 (also after a bunch of patches but broken in the latest patch in a potentially monitor-damaging way. I wonder how the f**k they did THAT though??) can do it, I wonder why other games companies can't.

Re:Sucky Resolution Support (1)

99BottlesOfBeerInMyF (813746) | about 8 years ago | (#15100585)

I find it sickening that modern games do not support what should be standard screen resolutions.

It will probably upset Mac gamers even more than most. Since such a large percentage of Macs are widescreen, I don't think I've ever seen a Mac game that did not support them. Also, many Mac users love to bitch about the Windows platform, in general :)

Re:Sucky Resolution Support (1)

great throwdini (118430) | about 8 years ago | (#15101396)

Since such a large percentage of Macs are widescreen, I don't think I've ever seen a Mac game that did not support them.

There are plenty of OS X native games that don't support widescreen. The last one I personally played was Tropico 2: Pirate Cove, which gives some idea of how recent a game can be and still ship plagued with this issue out of the box.

Re:Sucky Resolution Support (1)

99BottlesOfBeerInMyF (813746) | about 8 years ago | (#15106596)

The last one I personally played was Tropico 2: Pirate Cove, which gives some idea of how recent a game can be and still ship plagued with this issue out of the box.

Gee, and it's made by MacSoft too, what a surprise. I stopped buying MacSoft's crappy games long ago. They are always unstable and poorly done.

Re:Sucky Resolution Support (1)

UnknownSoldier (67820) | about 8 years ago | (#15100826)

> All console games these days have widescreen support. It is not hard to do.

Technically, no, but artistically, it is, in order to do it right.

It's about providing a UI that looks good at any resolution.

It's much easier to make a UI look good at 4:3 than to do "double" the work to support 16:9 or some other "oddball" configuration.

Yeah it sucks, but as a programmer I can appreciate the amount of work an artist has to do.

Cheers

Re:Sucky Resolution Support (1)

PygmySurfer (442860) | about 8 years ago | (#15102087)

They could at least support the resolution and just stick the image in the middle of the screen. I think most widescreen gamers would be happy just to not have the 4:3 image stretched to 16:9 or 16:10. The problem is not so much that it's not widescreen, just that it's all distorted.

Re:Sucky Resolution Support (0)

Anonymous Coward | about 8 years ago | (#15105598)

There is a simple .ini tweak that allows you to set FEAR to run at any resolution you choose... The game automatically recalculates the FOV and all other required math...

Seems like they anticipated going outside 4:3, but then just neglected to include the options in the settings window.

You have to be careful: set all your other settings in the in-game options, then quit and edit the .ini manually, set it to read-only, and then reload the game... Problem solved, although I agree it should just do it out of the box.

Kind of offtopic... (1)

spxero (782496) | about 8 years ago | (#15100234)

... But why should the widescreen folk have a better view than the 4:3 folk? Imagine playing a game online, and you have a 4:3 screen. It's great, it looks good. But then someone else you are playing against has a 16:9 widescreen and he sees not only what you are able to see, but more (on the sides). So his 'character' has a better peripheral vision because he has a widescreen monitor?

Having the widescreen stretch the view out seems like less of a programming issue and more of a gamer-fairness issue.

Re:Kind of offtopic... (2, Informative)

Krach42 (227798) | about 8 years ago | (#15100302)

... But why should the widescreen folk have a better view than the 4:3 folk? Imagine playing a game online, and you have a 4:3 screen. It's great, it looks good. But then someone else you are playing against has a 16:9 widescreen and he sees not only what you are able to see, but more (on the sides). So his 'character' has a better peripheral vision because he has a widescreen monitor?

Blame the industry for lack of foresight; meanwhile, my widescreens and I will enjoy the extra peripheral viewspace.

To note, though, I have a PowerPC Mac with widescreen, and got the Doom 3 demo, and I had to bump up the FOV in order to not get a "stretched" image. Meaning the resolution was widened but the angle of view was still the same as on an unstretched monitor.

In this case, everyone is able to set the FOV to the same values that I am, and the Doom-engined games have long allowed servers to restrict FOV ranges, since people could set these to very high values, adjust to them, and thus end up being able to see out the sides of their heads.

Having the widescreen stretch the view out seems like less of a programming issue and more of a gamer-fairness issue.

If one is actually concerned about this "fairness" issue, then Macs have long offered a mode where the image is not stretched, but rather the standard resolution is centered in the middle of the screen. This looks a HECK of a lot better than a stretched image, where people look fat and distorted.

Also, again the same point as above: anyone can adjust FOV angles in the games that support it, so if you're willing to deal with a distorted image, you can have the same FOV range as I do.

EAX and Multi-Channel Audio (1)

damacus (827187) | about 8 years ago | (#15100448)

Problem with extended peripheral vision? How about surround sound? The gap is already there. Someone wearing headphones or using standard 2(.1)-channel sound is at a disadvantage against someone using 5.1+ who can literally hear their opponents' footsteps behind them.

Re:EAX and Multi-Channel Audio (1)

Krach42 (227798) | about 8 years ago | (#15101079)

True. In my LAN gaming group, we used to play with open speakers for all to hear. It was common habit that if you were searching for someone, you would jump, and listen for their speakers to make any noise.

This worked well to your advantage until people started bringing surround sound systems and could target you based on the 3D positioning information afforded them. Sure you know that they heard you jumping, but now they know which direction you're in, and you don't.

Of course, all that stopped once we started bringing just headphones to the games. It turned out that an even bigger advantage than being able to positionally locate someone was not letting them listen in on what you were doing at all.

It was kind of like an arms race...

Re:Kind of offtopic... (1)

Teh MegaHurtz (954161) | about 8 years ago | (#15100872)

Is it an advantage? Sure, of course it is. Is it an UNFAIR advantage? I would say hardly. I spend good money on high-end gaming hardware (video card, monitor, sound and internet connection); I pay, and dearly, for the little advantage I do get, and the fact that other players may not be able to afford the same level of hardware doesn't make things unfair. If I were to race a Ferrari with my Honda, would the Ferrari driver have an unfair advantage? Of course not; he paid for the advantage that he gets. I still play the same game as everyone else, I just take advantage of every benefit I can get out of the game. And as far as F.E.A.R. goes, I played it in 1680x1050... while the game may not support it natively, you can force most any game to play at a nonstandard resolution. http://www.widescreengamingforum.com/ [widescreen...gforum.com] is my best friend when it comes to games that don't support the res that I want to use.

Re:Kind of offtopic... (0)

Anonymous Coward | about 8 years ago | (#15100328)

and people with faster computers than you have less lag and it isn't fair. oh well.

Re:Kind of offtopic... (1)

MyDixieWrecked (548719) | about 8 years ago | (#15100393)

The graphics shouldn't stretch. Quake 3 doesn't have widescreen support, per se. When I play Quake 3 on my Dell FPW2005 or on my PowerBook, it puts black bars at the sides. It doesn't stretch and distort the view.

It's a matter of properly programming the video code to compensate for strange resolutions. ...spike
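
For what it's worth, the pillarboxing described here is only a few lines of arithmetic. A generic Python sketch (not Quake 3's actual code) of centring a 4:3 image on a wider panel:

    def pillarbox(screen_w, screen_h, aspect_w=4, aspect_h=3):
        """Viewport (x, y, w, h) that centres a 4:3 image on the screen,
        leaving black bars at the sides instead of stretching."""
        view_h = screen_h
        view_w = screen_h * aspect_w // aspect_h
        if view_w > screen_w:  # screen narrower than 4:3: letterbox instead
            view_w = screen_w
            view_h = screen_w * aspect_h // aspect_w
        x = (screen_w - view_w) // 2
        y = (screen_h - view_h) // 2
        return x, y, view_w, view_h

    print(pillarbox(1440, 900))  # -> (120, 0, 1200, 900) on a 16:10 laptop panel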

Re:Kind of offtopic... (3, Insightful)

99BottlesOfBeerInMyF (813746) | about 8 years ago | (#15100459)

So his 'character' has a better peripheral vision because he has a widescreen monitor?

Imagine a gamer with a great video card and monitor. With the better resolution and size he can make out objects that are further away. Shouldn't all games be restricted to 640x480 at a fixed size on the screen? Otherwise some characters can see further and in better detail than others. Some people might have two monitors, allowing them to reference a map, IM with other players, or view cheats at the same time as the game. Games need to detect and turn off multiple monitors. Also, some gamers use joysticks and trackball setups that allow them to click buttons faster. Games should only support standard keyboards and mice, lest some characters have better reaction times than others.

You could argue this for all sorts of hardware, but it does not really matter. People who spend more on the best hardware and connection will gain some slight advantage. That's life. In any case failing to deal with widescreen monitors and distorting the picture is pathetic. I thought all games checked for this and at worst put some black bars on the right and left, like the ones at the top and bottom for widescreen movies on a standard TV.

Re:Kind of offtopic... (1)

spxero (782496) | about 8 years ago | (#15100840)

You're right. Those who can't keep up and get the latest equipment won't be able to play the latest games. But widescreen monitors just aren't the majority of monitors being bought. The gaming industry knows this, so they aren't forcing a move to widescreen just yet. The progressive companies are putting bars on the sides; I'll even bet that some games have widescreen availability (on computers). But for the most part, the majority of monitors purchased are 4:3.

Re:Kind of offtopic... (1)

99BottlesOfBeerInMyF (813746) | about 8 years ago | (#15101162)

But widescreen monitors just aren't the majority of monitors being bought.

You're right and, at the same time, not quite right. You see, this article is for/by people running Macs, and most Macs have widescreen displays. Aside from iBooks, I'm not even sure Apple sells any non-widescreen systems. So current Mac users (the most likely users of Boot Camp) are used to everything, including games, handling widescreen. I've never run a game under OS X that did not handle widescreen, that I recall. It seems like gaming companies should get with the program; it's not like widescreen is rare anymore.

Re:Kind of offtopic... (1)

spxero (782496) | about 8 years ago | (#15101360)

You're right - this article is for Mac users with widescreens. And until the manufacturers decide to come down off their widescreen=$$$$ perch, I don't think many Windows users will be converting. And as long as Windows has its stranglehold on the PC market, you probably won't see the vast majority of game companies changing for the Mac users. It'd be nice to see, especially if it means monitor prices go down. But I doubt it will happen for the majority of new games coming out for quite some time.

Re:Kind of offtopic... (1)

AaronLawrence (600990) | about 8 years ago | (#15104376)

It's more than a slight difference. If gamer A has 1920x1440 resolution with anti-aliasing and gamer B has 800x600 with no AA, A will have a massive advantage. This is one reason why some gamers are able to shoot accurately from ridiculous distances.

Re:Kind of offtopic... (1)

Rimbo (139781) | about 8 years ago | (#15100519)

Having the widescreen stretch the view out seems like less of a programming issue and more of a gamer-fairness issue.

If it's about fairness, then everyone should be given free top-of-the-line PCs and high-speed internet connections. That, or you force everyone down to the lowest common denominator of framerate, resolution and bandwidth. Because framerate is an advantage in first-person shooters and people have varying qualities of hardware and network connections, the games are unfair to begin with.

Worrying about the width of a user's screen seems silly in comparison to the difference a good broadband service can make.

Re:Kind of offtopic... (1)

xtieburn (906792) | about 8 years ago | (#15100584)

They used the same excuse with StarCraft to explain why it was limited to a terrible resolution.

One simple solution that solves the whole thing. Server side settings.

The server can determine the max resolution, the resolution types and pretty much every other setting for anyone who connects. As long as you program that into the interface for online gaming, there should be no limits on how great you can make things look.

Re:Its unfair since the dawn of internet gaming... (1)

vertinox (846076) | about 8 years ago | (#15102244)

Having a Cable modem when everyone else was on dialup was unfair.
Having a laser mouse vs the old style mouses is unfair.
Having a computer that can run the game at 60fps vs a pos machine that runs it at 12fps is unfair.
Having a 21" monitor playing against a kid with a 15" is unfair. (Mostly because the 21" guy can see better with his eyes whil ethe 15" is having to look at less detail and may not see the other person move).

So computer gaming is all unfair like this... Otherwise I suggest a console. Or maybe a DS.... Mmmm... Tetris DS.

Re:Kind of offtopic... (1)

mrshoe (697123) | about 8 years ago | (#15103227)

The widescreen people will in fact not see anything that the 4:3 people don't see. They might see more detail if their resolution is higher, but the portion of the scene that is visible to them is the same as that visible to the players using a 4:3 screen. Read up on rasterization [wikipedia.org] to find out more.
Basically, the field of view angle determines how much you see, not the size of your screen.
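
In other words, with a fixed FOV the widescreen player sees the same slice of the world, just stretched; extra peripheral view appears only if the game widens the horizontal FOV with the aspect ratio (so-called Hor+ scaling). A small Python sketch of that conversion, assuming a symmetric perspective projection:

    import math

    def hfov(vfov_deg, aspect):
        """Horizontal FOV implied by a vertical FOV and an aspect ratio."""
        half_v = math.radians(vfov_deg) / 2.0
        return math.degrees(2.0 * math.atan(math.tan(half_v) * aspect))

    # Same vertical FOV: the 16:10 player sees a wider horizontal angle
    # only if the game scales FOV like this; with a fixed 4:3 FOV both
    # players see the same scene.
    print(round(hfov(60, 4 / 3), 1))    # approx. 75.2 degrees on 4:3
    print(round(hfov(60, 16 / 10), 1))  # approx. 85.5 degrees on 16:10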

Re:Kind of offtopic... (1)

TheStonepedo (885845) | about 8 years ago | (#15103482)

LCDs, for the most part, do not have great internal algorithms for scaling from arbitrary resolutions to their actual pixels. Particularly with laptops, where screen adjustment is done through software rather than buttons on the screen, different resolutions are displayed letterboxed or cropped. On a 1024x768 screen, one can play a game at 1024x576 using the middle 576 rows of the screen while leaving the top 96 rows and bottom 96 rows black. My 1280x1024 Sony LCD monitor displays all resolutions as actual pixels rather than scaling and making things look shitty.

Ummm...no. (1)

Shimatta1 (257977) | about 8 years ago | (#15105081)

Yeah, and some people have faster internet connections than me, but they should lag the same way I do. It's only fair, after all. Hell, some people have specialized peripherals (e.g. gaming mice, extra keypads, joysticks, etc.); they should just be ignored by the computer, because it's unfair to those who don't have them!

Setting aside the hardware envy, game creators do need to take into account that not all screens are created equal anymore. Even without extending field of view to give an "unfair" advantage to widescreen players, they could use letterboxing (filling the extra width with black space) so as not to put the widescreen users at a disadvantage.

Shimatta1, (sole?) student of the joystick/mouse style of FPS.

I understand that running Windows on a Mac ... (2, Interesting)

kikensei (518689) | about 8 years ago | (#15100331)

is a new idea, but I don't get the hubbub. Once Apple switched to Intel, they began churning out typical x86 PCs. Yeah, they look cooler, but why would anyone expect that they would bench/perform differently from a generic white box with the same specs? This seems to be much ado about nothing. It's great that the Apple computers have the secret DRM chip that allows OS X x86 to be installed, and the dual-boot option may make this a great option for some folks. But to bench them and remark with wonder about the results compared to any of a bijillion other Intel-hardware-based Windows PCs seems odd.

Re:I understand that running Windows on a Mac ... (1)

99BottlesOfBeerInMyF (813746) | about 8 years ago | (#15100379)

Yeah, they look cooler, but why would anyone expect that they would bench/perform differently from a generic white box with the same specs?

The debate about whether Apple laptops or typical PC laptops are faster has raged for a decade. The debate about OS X versus Windows for speed has not slackened either. Now we can actually benchmark them. They seem to be benchmarking about the same as top-of-the-line PCs. This is good news for Apple customers, since it means the machines are functional using both systems, especially for games, which is the main reason someone would dual-boot. More interesting to many of us, we can benchmark the same software under both OSs and both sets of hardware to gain insight into the hardware and software bottlenecks in Apple machines and in the OSs.

Re:I understand that running Windows on a Mac ... (1)

MrJynxx (902913) | about 8 years ago | (#15100615)

The reason for the benchmarks is to prove that Apple is not churning out typical x86 PCs, but ones that are made with higher-quality parts, which is why they cost more $$$. Maybe compare the new Mac platform to Alienware. Those cost much more than the bargain-basement PCs you buy for $500 at your local computer store. So why is Alienware significantly more money? It's because of higher-quality parts.

MrJynx

Re:I understand that running Windows on a Mac ... (2, Interesting)

NutscrapeSucks (446616) | about 8 years ago | (#15100820)

There's been the argument that Apple is slacking on its OpenGL drivers. So this is interesting, at the very least, because people can perform direct A/B tests.

Re:I understand that running Windows on a Mac ... (1)

kikensei (518689) | about 8 years ago | (#15102338)

Huh? They're not benching Apple drivers, they're benching Windows drivers.

Re:I understand that running Windows on a Mac ... (1)

dborod (26190) | about 8 years ago | (#15103706)

Classic mode (wherein one can run Mac OS 9 apps) is only available on PPC, not Intel.

Oblivion on iMac (2, Informative)

odhinnsrunes (698134) | about 8 years ago | (#15100352)

I installed Boot Camp last week, and other than some issues with some older games running too fast or not correctly measuring the speed of the processor, it worked great. I ran out and bought Oblivion, and it installed and runs great. I found the same issues as those in the article, but they are easily resolved with some very minor tweaking. I don't really consider myself a gamer, but I was impressed with the distance cueing limits, etc., and the frame rate was good. I was able to play for several hours, and the only problem I found was that if you have anti-aliasing on, the Oblivion Gates slow the framerate right down when they are on screen. Keep it on the default HDR setting and everything is fine.

Not a hardware issue? (0)

Anonymous Coward | about 8 years ago | (#15100382)

> Annoyingly, F.E.A.R. offers a really pitiful selection of resolutions, all of which are constrained to the old-fashioned 4:3 aspect ratio -- meaning that play on the MacBook's widescreen is stretched, and kind of ugly. That's not a hardware issue so much as limited programming, and presumably anyone with a widescreen PC is in the same pickle.

Weird, I never had trouble with 4:3 resolutions on my 8:5 HP f2105 monitor. I find it odd that Apple failed to include options such as the following on their wonderful hardware:
  • Stretch to fill (the default, stretches everything);
  • Keep aspect ratio (stretch until one of the dimensions is maxed out);
  • Pixel by pixel (align resolution pixels with the display's).

Re:Not a hardware issue? (2, Insightful)

node 3 (115640) | about 8 years ago | (#15101232)

Weird, I never had trouble with 4:3 resolutions on my 8:5 HP f2105 monitor. I find it odd that Apple failed to include options such as the following on their wonderful hardware:

Notebooks don't have on screen displays for LCD settings.

But ignoring that, Apple's hardware and OS properly support their displays, making the OSD controls you mention unnecessary.

In other words, you're asking why Apple doesn't have kludgey workarounds for a problem that doesn't exist on the Mac. It's not Apple's fault for not including unnecessary hacks, it's Windows'/F.E.A.R.'s fault that they need them.

In case you're wondering, this is what Mac users mean by "it just works". Why should a person have to worry about something the computer is fully capable of correctly doing itself?

Re:Not a hardware issue? (1)

snuf23 (182335) | about 8 years ago | (#15101470)

"It's not Apple's fault for not including unnecessary hacks, it's Windows'/F.E.A.R.'s fault that they need them."

You may not be familiar with how Windows works. You see third party companies make the hardware - not Microsoft. ATI in this case makes the video chip in the MacBook Pro. So first stop for blame should be ATI for not implementing this. Although, as the other poster noted, it is in fact implemented in the ATI video driver. Now if the game manufacturer for whatever reason decides not to support widescreen gaming, you can blame them too.
All modern widescreen capable video cards on Windows have these options by the way.

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15102020)

You see third party companies make the hardware - not Microsoft. ATI in this case makes the video chip in the MacBook Pro.

How is that different with Mac OS? ATI still makes the card, either way.

So first stop for blame should be ATI for not implementing this.

No, first stop for blame is Windows for not taking care of this sort of thing. This is exactly what OS's are supposed to do.

So you've clearly missed my point. It's this sort of thing that makes Macs "just work". If MS doesn't take the initiative to make Windows "just work", and instead relies on third parties, it will always lag behind a company which takes these things seriously, like Apple does.

Re:Not a hardware issue? (2, Interesting)

snuf23 (182335) | about 8 years ago | (#15102289)

Sure. Like Apple doesn't work with ATI or Nvidia on any of its drivers.
Apple supports a small subsection of hardware. Windows runs on a vast selection of hardware. I don't see this as being particularly comparable.
And I really wish you would tell the Mac users I support at my office that it "just works", because they call me for support when it "just isn't working".
I use and work with OS X. It's a decent OS but it has its problems and this bullshit "it just works" crap is getting seriously tired. It's like that "insanely great" crap all over again.

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15102747)

Apple supports a small subsection of hardware. Windows runs on a vast selection of hardware. I don't see this as being particularly comparable.

Because you aren't paying attention. It has nothing to do with the video driver, and everything to do with what services the OS provides.

It's not the driver's job to decide whether or not to scale the video. It's the OS's job to tell the driver what to do (and, optionally, the application's job to ask the OS to scale or not). Windows, apparently, doesn't do that. But this has absolutely nothing to do with "Windows has to support a vast amount of hardware".

I use and work with OS X. It's a decent OS but it has its problems and this bullshit "it just works" crap is getting seriously tired.

You simply just don't understand what that phrase means. It doesn't mean things can't or don't go wrong. It means that you don't have to go through so many hoops to get things working. For most hardware, you just plug it in and "it just works". This scaling issue is just another example of that. For most things, you don't have to tinker around, you just use it.

Do you realize how abysmally ignorant you sound when you say something doesn't exist, when this entire thread is about an example of that very thing? If "it just works" isn't true, why does the Mac have no problems with varied aspect ratios and video scaling (in other words, "it just works"), yet on Windows it's a grab-bag where sometimes it works, sometimes you have to mess with the OSD on your display, sometimes you have to tinker with the video driver settings, sometimes you can choose the right setting in the game, and sometimes no matter what you do you can't get it to work right?

Re:Not a hardware issue? (2, Interesting)

snuf23 (182335) | about 8 years ago | (#15103146)

"It's not the driver's job to decide whether or not to scale the video. It's the OS's job to tell the driver what to do (and, optionally, the application's job to ask the OS to scale or not)."

So let me get this straight - it's the OS's responsibility to tell the underlying hardware what features it has? Even though the hardware may or may not support the feature? I beg to differ. The driver on Windows exposes the hardware capabilities of the device to the operating system. So you don't have a situation where Windows attempts to force a 10-year-old VGA card to do widescreen. You'd never have this problem on a Mac because you don't have to worry about old hardware. It's very easy for Apple to have the OS contain all of the information about any hardware it might need to run - after all, Apple controls exactly that.

"For most hardware, you just plug it in and "it just works"."

I would change that to "for some hardware you just plug in and it just works most of the time except for when it doesn't".
Let me tell you a story about the Editorial department at a magazine I work for. We recently moved them up to OS X and guess what? All their digital voice recorders (USB devices) stopped working. Apparently there is no OS X support for them. And they are only about a year old. Wheee. It just didn't work and just hasn't worked, and the staff had to go out and buy new ones that do.
Or the staff member who was taking pictures on their digital camera and tried to move them to the Production Macintosh. Oops, no Mac support for that camera. It just didn't work. They had to plug it into a PC to extract the photos. Now by my analysis the camera manufacturer would be to blame, but by yours it's obviously the fault of Apple, since the OS should handle this automagically! After all, it "just works" with other cameras, so why doesn't it "just work" with this one? Of course the Production cameras work because we bought them specifically to be Mac compatible, but really shouldn't the OS support any camera by your logic?
Or maybe you can explain to my friend why, after he ran a recent Apple update, his wireless card no longer works? Would that be "just used to work"?

Do you realize how abysmally ignorant you sound spouting a marketing catchphrase over and over again? Do you work with Macs? I have 40 of them onsite here and have seen them screw the pooch often enough to realize that although the OS is good, it has its problems. It does not always "just work" and sometimes fixing the problem is far from trivial. Try dealing with font management on OS X in a print production environment. Holy shit, have I seen some weirdness.
It's impossible to have a reasonable conversation about OS X or Macs in general because of this whole starry eyed "it just works" oh thanks my savior Jobs viewpoint.
I've used Macs professionally for 16 years and am well versed in the good and bad points. Stop drooling on your MacBook and acting like an Apple-marketing-programmed robot.

Re:Not a hardware issue? (0)

Anonymous Coward | about 8 years ago | (#15104015)

Yeah, that's what "just works" means. It either works easily, or it's utterly fucked and you have to go spend more money, at which point it will work easily. Macs are for rich kids, they don't consider this a problem.

That's as far as hardware goes, anyway. Your software tribulations... well, Mac users aren't supposed to be doing anything complicated with them either. They're for noobs, not pros.

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15104738)

"It's the OS's job to tell the driver what to do (and, optionally, the application's job to ask the OS to scale or not)."

So let me get this straight - it's the OS's responsibility to tell the underlying hardware what features it has?


No. You even quoted me and got it wrong. I didn't say the OS should tell the hardware what it can do, but what to do.

You keep ignoring this video card issue as a perfect example. The card supports scaling. Mac OS can tell it to scale. Whether or not Windows can tell it to, it clearly did not.

An intelligent OS should be able to query the driver to find out if it supports a feature. If it does, it uses it, if it doesn't, and the feature is reasonable to implement in software, it should do that, entirely transparently to the user and the application. Windows clearly did not do this at all.

I would change that to "for some hardware you just plug in and it just works most of the time except for when it doesn't".

Then you would change it to mean absolutely nothing. That statement would be true for even the most obscure, least usable, poorly supported OS in the universe. But clearly, some OS's "just work" more than others. For example, MS DOS could hardly be said to "just work", Linux can, but usually doesn't, "just work", Windows rarely "just works", but doesn't take a lot of effort to get most things working. OS X is the only OS where the phrase "it just works" is a reasonable claim.

XP has come a long way, but it's still a far cry from OS X with regards to "it just works".

Of course the Production cameras work because we bought them specifically to be Mac compatible, but really shouldn't the OS support any camera by your logic?

Of course not. It should support any camera with drivers. If there was no Windows driver for the X1600, I wouldn't blame Windows for that. But if there is a driver (and there is), Windows should be intelligent enough to handle things that the user should not have to concern themselves with.

It does not always "just work" and sometimes fixing the problem is far from trivial.

I never said it always "just works".

It's impossible to have a reasonable conversation about OS X or Macs in general because of this whole starry eyed "it just works" oh thanks my savior Jobs viewpoint.

No, it's not impossible at all. It might be impossible for you, but that's because you don't understand the terms being used. "It just works" does not mean "it always just works" or "nothing can ever go wrong" or "all hardware works flawlessly". It means that the OS takes care of things so you don't have to. There are absolutely situations where this fails to be true, but "it just works" holds far, far more often on Mac OS X than it does under Windows or Linux.

If you re-read my posts, you'll realize I tend to use qualifications ("it usually", "most of the time", "tends to", "it should", things like that), but you are ignoring them and pretending like I said "it always". Obviously, if you hold me to that standard, I can't be right, but that's not what I'm claiming. You're wasting time disproving a claim no one is making.

Or let's put it a different way. Which OS, Windows or Mac OS X, tends to "just work" more often? Which does the right thing without requiring user intervention? Without popping up dialog boxes asking what you want to do? Without flashing icons everywhere letting you know it did exactly what you should expect it to do? Without running a wizard every time you first do something--a wizard which over 99% of the time is just an exercise in clicking "Next" a half dozen times?

*That's* what "it just works" means. In the "it just works" department, OS X is vastly superior to Windows--it's virtually impossible to have used both OS's for any significant period of time and not realize this. That doesn't mean a person is stupid for using Windows, that there's no reason to use Windows, or that Windows is not good at anything, so I don't see why you seem to be so defensive about this.

If this thread were about games, and you said, "Windows has many more games than OS X", would it make sense for someone to point out a list of games that Windows doesn't have in an effort to prove you wrong? Of course not, because your claim wouldn't have been that the Mac has no games, or that there aren't games which are Mac only. It's just that Windows has more games.

Re:Not a hardware issue? (1)

snuf23 (182335) | about 8 years ago | (#15108292)

We obviously have a different understanding of the term "just works". In the English language, if you say something "just works", it means exactly that: it works properly without any conditions. By adding conditions to the phrase you are attempting to reinvent the language. Which is why I'm calling you an Apple marketing tool. "Just works" doesn't mean mostly, or more often than something else, or only with devices that have drivers. It means exactly what those two words mean. And by the English-language meaning of the phrase it is not correct to state that OS X "just works". By using that language you are implying exactly what the words mean.
Mac OS X: "it just works as long as proper conditions are met".

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15111642)

Now you're being abundantly silly. By your definition, absolutely nothing can be said to "just work", since nothing works if the proper conditions aren't met.

By adding conditions to the phrase you are attempting to reinvent the language.

No, I'm not. There is no requirement in the English language that all phrases must be absolutely and unqualifiedly true.

OK, your initial reaction is to take the phrase as an absolute and unqualified statement. You're not omniscient, and the language isn't perfect, so we're all bound to get things wrong now and then. But I've pointed out that the term is not meant to be taken that way. In other words, I've qualified the statement for you. At this point you really have no excuse for continuing to get it wrong in the exact same way.

If I tell you my car is hot, and you touch the air conditioner and say, "no, it's very cold", was I lying? If I say the ocean is deep, and you wade a few yards in and say, "no it's not, it barely comes to my knees", was I lying? Yes, both statements were said unqualified, but that doesn't make them wrong, except in a universal and absolute sense. Do you think it would be reasonable to hold every unqualified statement made to such a standard? You could try, but you're going to find that a lot of people will avoid talking with you at all costs.

You're deliberately treating the phrase "it just works" in a sense in which it's not meant. I'll attempt to explain it to you once again, but if all you're going to do is try to disprove the "unqualified and absolute" sense of the term, you're just wasting time, because no one is meaning it that way.

Take "it just works" as a quality, an attribute. Like a color. To what extent can an OS be said to "just work"? Similarly, to what extent can a building be said to be red? Your standard brick wall might be very red, with just a small amount of grey mortar. It's not "unqualifiedly and absolutely red", but it's not wrong to call it red. Given the processes which users go through, adding and removing hardware, adding and removing software, running software, accessing the Internet, and so on. To what extent does Windows "just work"? Under Windows, just about everything requires user interaction. Just about everything requires a wizard, or a choice of options, or installation of drivers, etc. None of these interactions are terribly difficult, but they are annoying and wholly unnecessary. Windows has a very low "just works" quotient. That's not an absolute. Sometimes Windows does just work. For example, you no longer have to reboot after plugging in a USB mouse.

For Mac OS X, the amount of work the user has to do to get something done is far less. Far, far more things are automatically and correctly configured. Installing applications virtually never requires any sort of "wizard" to set up. Mac OS X has a very high "just works" quotient.

Does that make sense to you? Do you understand now what people mean by "it just works"? Do you understand how something can be said to have a quality, even if that quality is not universal and absolute? (and in fact, rarely does an object have a singular quality universally and absolutely)

Re:Not a hardware issue? (1)

snuf23 (182335) | about 8 years ago | (#15117918)

My point is simple. Using the term "just works" in reference to OS X is simply marketing spin, and I find it insulting when things "just don't work".
Just like calling the Mac "insanely great" when some aspects were "insanely stupid" (the extension debacle in OS 9 and lower comes to mind).
The term is bandied about so much it's just ridiculous. You paint a magical picture of the perfect operating system, when it has plenty of issues. I'm not arguing, nor have I been arguing, that Windows is better, just that at least with Windows you don't have a retarded level of evangelism that gushes about things "just working" and ignores what doesn't. As someone who uses Macs in business I find that starry-eyed attitude irritating. Most of the people I see with it are consumers who never dealt with Macs in a production environment.
We'll never agree on this - you think it's a valid "attribute" of an operating system, I think it's a subjective term that originated in marketing and glosses over any deficiencies in the OS.

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15119072)

You paint a magical picture of the perfect operating system, when it has plenty of issues.

That's called a "straw man". It's not an argument I'm making (I've pointed this out many times), yet you keep trotting it out and skewering it, and pretending like you've just defeated my argument. You haven't. How many times do I have to say, "OS X isn't perfect" before you stop saying "aha, but don't you know? OS X isn't perfect! I bet you didn't know that!!"?

you think it's a valid "attribute" of an operating system, I think it's a subjective term

"Valid" and "subjective" are not mutually exclusive. I absolutely agree that it's subjective. But that does not mean it doesn't exist. You might as well claim a Mercedes doesn't have that "luxury" quality any more than a Chevy Nova, since "luxury" is subjective and used by Mercedes' marketing firm. It's certainly possible to make and defend such a claim--all you have to do is ignore any measurement of "luxury". That doesn't change the fact that the Mercedes really is more luxurious, it just makes you look like... Well, let's just say it doesn't put you grasp of the subject in a very good light.

This is about the point where you jump in with "Ah! But OS X can't have the quality of 'it just works' because it is imperfect!"

<sigh>

I'm not saying OS X 100% "just works". Just that "it just works" is a significant attribute of Mac OS X. Not a perfect and absolute attribute, just a significant one. And it's an attribute OS X has in great excess when compared to Windows or Linux. That's not to say Windows and Linux don't excel in other areas, just that in the "it just works" dept, they are notably inferior to OS X.

Re:Not a hardware issue? (1)

snuf23 (182335) | about 8 years ago | (#15144350)

Yes, and in my world there is no "it just works" dept. It's a bullshit term that was made to sound oh so cute by Apple. And you are a tool for using it, quotes and all. I don't consider it to be a significant attribute of any operating system. I consider it to be an invented term in the context of how you are using it. It is entirely subjective. For example, if a Windows user or a Linux user has an expectation of how some function is performed and it doesn't work that way on Mac OS X, it will not "just work" for them. I have users who can't grasp the concept that when you close all of the windows the application stays open. They don't grasp that the top menu bar changes when you are on different applications. For them, this doesn't work, no matter how many times I've told them how it works. One of these people has used Macs all his life. Previous to OS X, I would get a call at least once a month where he would say he couldn't open program X because he was out of memory, even though he said he had closed all of his applications. Of course he had just closed the windows and not quit the applications, so his memory was still being used by them. Now this guy has used Macs for 15 years and he still doesn't get that one concept. It's not "just working" for him; it causes him confusion.
Bottom line, it's a stupid marketing term spun by Jobs that is completely overused and helps to create false impressions of the reliability of the operating system. I'm glad you like it so much; I'm sick of having to explain to users who are confused because their OS isn't "just working" like it does for the smarmy man in the turtleneck.

You're wasting your time (0)

Anonymous Coward | about 8 years ago | (#15171848)

You're just wasting your time with that troll - it's been known to twist others' words repeatedly in a vain attempt to always get the last word with its flawed arguments.

Re:Not a hardware issue? (1)

snuf23 (182335) | about 8 years ago | (#15108962)

"You keep ignoring this video card issue as a perfect example. The card supports scaling. Mac OS can tell it to scale. Whether or not Windows can tell it to, it clearly did not.

An intelligent OS should be able to query the driver to find out if it supports a feature. If it does, it uses it, if it doesn't, and the feature is reasonable to implement in software, it should do that, entirely transparently to the user and the application. Windows clearly did not do this at all."


Oh yeah, and one question regarding this point. So based on what you're telling me, if you have a widescreen monitor, Mac OS X will set up all of your games to use widescreen? So then an older game like, say, StarCraft will support widescreen display with an expanded battlefield and properly sized interface? Because the OS told it that it's using a widescreen monitor? That would be interesting to see. It's exactly what you are implying.
On Windows it works like this: if you have a widescreen setup with a widescreen-capable video card, then your desktop will work fine with it. Full-screen applications such as games support however many resolutions the developer chooses to support. So older games, for example, may not support widescreen because they are not widescreen-aware; the hardware didn't exist at the time they were made. If a new game doesn't support widescreen, it is because the developers chose not to support it.

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15111749)

So then an older game like, say, StarCraft will support widescreen display with an expanded battlefield and properly sized interface? Because the OS told it that it's using a widescreen monitor? That would be interesting to see. It's exactly what you are implying.

Sometimes, yes, that's exactly what happens. For a game that's hard-coded to 4:3, for whatever reason, it's my experience that OS X will do a sort of "letterboxing", except that the black bars are on either side of the game, so that the game is still 4:3 and not awkwardly scaled to the widescreen ratio.

I can think of no technological reason why a graphics interface (DirectX, on Windows) can't do this automatically for a game, even if that game was written prior to the availability of modern widescreen displays.

Apple puts a lot of effort into making things work correctly with a minimal amount of effort on the end-user and the developer. Microsoft does not. I don't see why you're finding it so hard to believe that such a disparity of effort won't manifest itself in the end product. For a Windows (or Linux) user switching to a Mac, things "just work" so often it's amazing. Booting from an external drive? It just works. Networking? It just works. How do you install an app? Just run it--you don't have to install (but you'll generally want to copy it to the Applications folder so you know where it's at). How do you uninstall? Put it into the trash. How do you install a screensaver? Just double-click on it. Same with dashboard widgets. How do you tell the OS what application can open a file type? You don't have to do anything. So long as the appropriate application is merely on a drive that the OS can see, it will use it to open a file. These things all "just work" and they do so because Apple put the effort in to make them work. These things don't "just work" on Windows because MS hasn't put in the effort.

And, no, not everything in Mac OS X "just works", and sometimes things can go horribly wrong. But OS X "just works" far more often than Windows does.

Re:Not a hardware issue? (1)

Mantaman (948891) | about 8 years ago | (#15106510)

No, first stop for blame is Windows for not taking care of this sort of thing. This is exactly what OS's are supposed to do.

You're RIGHT!!

AFTER Windows has checked the drivers to see what it's supposed to be able to support.

If the driver says it can support YxZ rez, then Windows allows you to select YxZ rez (for games, blame the makers); if not, then how the hell is Windows supposed to know... magic? Little pixies?

With a Mac it will do the same thing, but as the hardware is so tight with the OS, they will write to the standard of that screen.

What I'm thinking is that the ATI card is specific to the Mac and they don't have the Windows drivers ready for it - maybe I'm wrong, I don't know what the ATI card is.

The reason why it 'just works' is because the HW is made to Mac standards, not the million and one different HW setups for Windows - this is why Windows drivers are important.

Re:Not a hardware issue? (1)

node 3 (115640) | about 8 years ago | (#15107281)

What I'm thinking is that the ATI card is specific to the Mac and they don't have the Windows drivers ready for it - maybe I'm wrong, I don't know what the ATI card is.

No, there are Windows drivers for all the video cards in all the Intel Macs.

The reason why it 'just works' is because the HW is made to Mac standards, not the million and one different HW setups for Windows - this is why Windows drivers are important.

Drivers are important for the Mac, as well. Mac OS X currently supports a large number of ATI and Nvidia cards.

Your argument doesn't hold up. This isn't some oddball, no-name, Taiwanese video card which doesn't work well with the chipmaker's drivers. This is a high-quality ATI card with ATI-supplied drivers for both OS X and Windows. In other words, this isn't some strange situation that MS couldn't possibly foresee, and the result is the exact opposite of what one would normally want to happen.

This is exactly the point I've been making. On the Mac, things just work. The user doesn't have to futz around with changing the scaling options on their display, or on their video driver.

On Windows, even with a brand-name card, brand-name drivers, and a high-profile game, it doesn't get things right. All Windows would have to do is notice that the game is asking for a 4:3 resolution while the display is widescreen and slap two black rectangles on either side; this should be automatic unless the application specifically requests otherwise.

Widescreens are common, ATI cards are common, 4:3 games are common. There's absolutely no reason Windows shouldn't automatically and accurately handle this sort of situation.
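
To make "slap two black rectangles on either side" concrete, the arithmetic is tiny. A hedged sketch (not how DirectX or any driver actually implements it, just the fit-and-center math, assuming the MacBook Pro's native 1440x900 panel):

    #include <algorithm>
    #include <cstdio>

    struct Rect { int x, y, w, h; };

    // Scale a frame to fit the display while preserving its aspect ratio;
    // whatever is left over becomes black bars (pillarbox or letterbox).
    Rect fit(int srcW, int srcH, int dstW, int dstH) {
        double scale = std::min(double(dstW) / srcW, double(dstH) / srcH);
        int w = int(srcW * scale + 0.5);
        int h = int(srcH * scale + 0.5);
        return { (dstW - w) / 2, (dstH - h) / 2, w, h };
    }

    int main() {
        // A 1024x768 (4:3) game on a 1440x900 (16:10) panel:
        Rect r = fit(1024, 768, 1440, 900);
        std::printf("draw at (%d,%d), %dx%d\n", r.x, r.y, r.w, r.h);
        // prints: draw at (120,0), 1200x900 -- a 120 px bar on each side
        return 0;
    }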

Re:Not a hardware issue? (2, Interesting)

aristotle-dude (626586) | about 8 years ago | (#15101308)

They are there. Those are options in the drivers for ATI cards, at least. The difference between Windows and OS X is that the latter offers control for such features outside of the driver.

widescreen gaming (4, Informative)

gEvil (beta) (945888) | about 8 years ago | (#15100394)

Handy link to the Widescreen Gaming Forum [widescreen...gforum.com] website. It includes a listing of games that work with widescreen monitors, including hacks, patches, and workarounds to get games that don't natively support them to work.

Re:widescreen gaming (1)

Bios_Hakr (68586) | about 8 years ago | (#15103325)

I find it hard to believe that OSX video drivers have no system to display apps without stretching.

In the Control Panel of both the Nvidia and ATI drivers, there is a setting that lets a 1024*768 game run with the extra pixels blacked out. The leftover pixels (256 of width and 256 of height on a 1280*1024 panel) become a border around the actual game. That way, everything looks right instead of stretched.

This is very useful considering most LCDs are 1280*1024 (a 5:4 panel) and most games are designed for *real* 4:3 resolutions, e.g. 1280*960.
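
The arithmetic behind that driver setting is just centering; a tiny sketch (assuming no scaling at all, which is what "blacked out" borders amount to):

    #include <cstdio>

    // Center a 1024x768 image on a 1280x1024 panel without scaling:
    // the unused pixels split evenly into borders on each side.
    int main() {
        int panelW = 1280, panelH = 1024;
        int gameW  = 1024, gameH  = 768;
        std::printf("border: %d px left/right, %d px top/bottom\n",
                    (panelW - gameW) / 2,   // (1280-1024)/2 = 128
                    (panelH - gameH) / 2);  // (1024-768)/2  = 128
        return 0;
    }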

...and that's with an underclocked GPU (2, Interesting)

frankie (91710) | about 8 years ago | (#15100412)

If you pump up the clock [google.com] with ATITool, frame rates jump 30-50% (at the cost of your Mac becoming unpleasantly noisy and warm).

Now you just need some blue neon - and maybe a carbon fiber spoiler on top - to give your iMac that Real Ultimate (gaming) Power! (tm)

Re:...and that's with an underclocked GPU (0)

Anonymous Coward | about 8 years ago | (#15100784)

Now you just need some blue neon - and maybe a carbon fiber spoiler on top - to give your iMac that Real Ultimate (gaming) Power! (tm)

Don't do it! My cousin did that once, but it was all a bit too much for him, and he flipped out and killed the whole town. I have to admit, the blue neon was totally sweet though.

Re:...and that's with an underclocked GPU (1)

CamD (964822) | about 8 years ago | (#15103839)

pfff, that's no big deal. These guys are so crazy and awesome that they flip out ALL the time. The purpose of modders is to flip out and kill people. They don't even think twice about it! And that's what I call REAL Ultimate Power!!!!!!!!!! [realultimatepower.net]

Played FFXI on my MacBookPro (2, Interesting)

falcon5768 (629591) | about 8 years ago | (#15100461)

and even with everything turned up and running a Dynamis Xarcabard (where there tends to be a huge number of monsters onscreen at once, along with 50-64 player characters) I didn't have a single instance of slowdown or lag - something even some of my friends with nice systems can't brag about.

Of course it's an older game, but it's much more processor-heavy than you would think, given how SE botched the coding for the PC version.

Half-Life 2 (3, Informative)

aftk2 (556992) | about 8 years ago | (#15100577)

Cabel (of the Mac software shop Panic [panic.com] ) has put up a quicktime video of Half-Life 2 running on his Intel iMac. In two words, it looks friggin sweet:

http://cabel.name/ [cabel.name]

(With apologies to his hosting provider.)

Re:Half-Life 2 (0)

Anonymous Coward | about 8 years ago | (#15101237)

Please don't take this the wrong way, but big deal? I have no idea why you'd want to watch a video of HL2 playing on a PC. Let's not forget something: Apple now makes "IBM PCs". They just happen to be expensive and come with an OS that many people like. I'd be more surprised if it didn't play HL2.

Now if Apple would just license OS X and let me put it on a $600 Dell laptop... a guy can dream.

Re:Half-Life 2 (1)

aftk2 (556992) | about 8 years ago | (#15114209)

You get no argument from me - but I'm mostly impressed because every time a new Mac model is released, there are lots of posts (primarily on MacRumors) that whine about the capabilities of the graphics card. It doesn't help that some ports of games to the Mac have been pretty sloppy (SimCity 4, I'm looking at you). So the idea that these machines can game passably is pretty impressive to me nonetheless.

What is a die-hard linux person to think? (2, Funny)

Jeff DeMaagd (2015) | about 8 years ago | (#15100690)

After all, you have a triumvirate of "evil" going on here: an Apple machine with Intel chips running Microsoft software.

OS X gaming largely unaffected... (2, Insightful)

Faust7 (314817) | about 8 years ago | (#15100882)

All the people crying that Boot Camp means the end of OS X gaming need to remember a certain reality: no software company with any sense will shut down a business unit that remains consistently profitable. So long as native OS X versions of software continue to bring in money for the companies that create them (Aspyr, Adobe, Microsoft, etc.), they'll stick around.

So the question is, would enough people keep using native OS X apps, thereby maintaining that profitability? I'd say yes, and I'd also say that Boot Camp really won't have much of an overall effect beyond increasing the Mac's market share slightly (and only slightly, because setting up dual-booting is an extra cost in terms of the XP license and the time involved to make it happen); Boot Camp is aimed at people for whom Windows is the exception, not the rule - i.e. people that always use native OS X apps if they're available. I honestly don't see this radically changing anything.

Re:OS X gaming largely unaffected... (1)

JamesNK (967097) | about 8 years ago | (#15101374)

I wrote a post on this a couple of days ago. I suspect the increased competition from Windows software combined with Apple's small install base could be quite negative for Mac developers. http://www.newtonsoft.com/blog/archive/2006/04/08/38.aspx [newtonsoft.com] No one is going to stop selling what they already have. Most of the cost of software is the development. Not developing new software is pretty much the equivalent of leaving the market.

Re:OS X gaming largely unaffected... (1, Insightful)

Anonymous Coward | about 8 years ago | (#15102011)

Adobe, Microsoft, etc. aren't going anywhere. Pro OS X users aren't going to stand for being forced to boot into Windows to do their day-to-day work, and a developer that forced the issue would be likely to lose a large base of paying customers. Mac users are a vocal (and, some might argue, trend-setting) minority, and for Photoshop or Office to drop out would leave a gaping hole for someone else to fill, which would threaten their position as the "standard" application of their field. Look at the damage Quark did to itself by shrugging its shoulders at OS X: it let InDesign walk in and take a huge chunk of, if not completely take over, a field it once dominated.

Also... Apple's been on a development tear themselves lately, and if a critical application does drop out they may fill the gap themselves.

Games may be a lost cause, but they were never really there to begin with (and I can't see MacPlay or Aspyr going down without a fight). The biggest danger is losing "boutique" apps, but for the most part those were never ported to begin with.

Re:OS X gaming largely unaffected... (1)

AaronLawrence (600990) | about 8 years ago | (#15104367)

no software company with any sense will shut down a business unit that remains consistently profitable
Not true. If they can make MORE profit by doing something else, they will shut it down in a flash. This happens all the time in software and other businesses. Just because your 50 developers can make a small profit on Product A doesn't mean you'll keep them on it - the same 50 developers might make a huge profit on Product B instead.

I haven't tried it, but... (1)

chaboud (231590) | about 8 years ago | (#15101679)

I don't have a MacBook Pro (so I haven't given this a shot), but people wanting to get F.E.A.R. running in widescreen should look here [widescreen...gforum.com] . It's a pretty easy hack to get the game running properly on widescreen displays.

The Wide-screen gaming forum [widescreen...gforum.com] has tons of simple fixes for quite a few games.

Oblivion on Mac Mini (0)

Anonymous Coward | about 8 years ago | (#15102475)

What are the chances of playing Oblivion at semi-decent res/FPS on the cheapest Mac Mini?? How about the Mini with faster clock speed and 1GB of RAM?

I just need my hit of ES is all....

Thanks
Tim

The benchmark I want to see..... (2, Insightful)

Slashcrap (869349) | about 8 years ago | (#15104253)

These benchmarks of Windows games running on XP on an Intel Mac are all very interesting - I mean who would have thought that a standard Intel laptop with an Apple logo on it would have performance roughly equivalent to a standard Intel laptop without an Apple logo on it?

But so far no one seems to have gotten around to benchmarking the Intel Mac running a cross-platform game under both Windows and OS X.

I just don't understand that. Is it possible that OS X would score too highly and the Apple crowd don't want to embarrass the Windows users? That's got to be it.

Widescreen Support? (1)

Khaotix (229171) | about 8 years ago | (#15109571)

F.E.A.R. runs widescreen just fine on my Dell 20.1" ... never had issues with widescreen support.