As posted on my tumblr:
Why in the hell do people cling to Windows XP? Yeah, I used to like 2000 and XP back in the day, but that day was almost a bloody decade ago. I can't even stand using it part-time in a VM nowadays because of its, uh, "intricacies" (to put it politely).
It sucks. Upgrade to either Windows 7 or Mac OS X already! I'm running Windows 7 perfectly fine on a five-year-old laptop. If your machine can't handle it, you're due for a system upgrade.
Posted because I'm having a horrendous time with its "secure" file sharing. I'm not putting a fucking password on my iTunes VM. That would just be stupid. And I don't want to couple a password with automatic logon just to satisfy some lame kludge that "enhances security" (protip from a former network admin: it fucking doesn't; all it does is annoy administrators and people who just want their fucking anonymous shares to work).
It's a good thing I have XP Pro running in my VM, since XP Home is much more guarded about its group policies for some reason (which, again, makes no fucking sense! Group policies cannot be fully utilized because you can't join domains, so the only reason you'd want them is to disable stupid fucking handholding features...thank goodness for Microsoft getting a damned clue with Vista and keeping that -- and the graphical ACL editor -- unlocked in the home editions).
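(For anyone stuck fighting the same thing: the usual workaround, if I'm remembering the right keys, involves two Local Security Policy settings -- XP refuses network logons for accounts with blank passwords, and "simple file sharing" forces every remote user onto the Guest account. Something along these lines in a .reg file should cover it, but double-check the values yourself before importing:)

```
Windows Registry Editor Version 5.00

; Allow network logons for accounts with blank passwords
; (the policy "Accounts: Limit local account use of blank
;  passwords to console logon only" -- 0 = disabled)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa]
"LimitBlankPasswordUse"=dword:00000000

; Turn off the forced-Guest behavior of Simple File Sharing
; (0 = "Classic" model: remote users authenticate as themselves;
;  Pro only -- Home hard-wires the Guest model, naturally)
"ForceGuest"=dword:00000000
```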
It's a nice game because I put in the disc, turn on the console, and unlike PC shit most of the goddamn time, it works.
Eh, Steam did make things a tad better.
I was particularly impressed with Mafia II for PC. I was literally able to start the game up and click play. On both my system and my dad's (his resembles my old one, only with a faster dual core CPU instead of a slower quad core), it automatically picked display options that provided an excellent mixture of performance and quality. The game both looks nice and plays smoothly. I haven't seen a game do as good a job as that before; usually they wind up dramatically underestimating your computing power, even if your system is from the same era as the game.
Of course, there's no telling how it will handle systems newer than it is, but from what I've seen, it has the best set of auto-detected settings I've encountered for current hardware.
That being said, I really hate the way developers are treating the PC, and not just in control sets. In particular, I'm kind of angsty at the way game developers treat ATI owners. Almost every fucking time I see a "Designed for nVidia" logo on a game, it seems to be broken on ATI cards in some way (with Mafia II being one notable exception). Usually the excuse is that "wahhh ATI has buggy drivers lol." Right. In that case, explain why companies that hire competent -- sometimes world-renowned -- programmers, like id Software and Epic Games, don't seem to have any issue supporting everything out of the box with their engines. Say what you will about Unreal Tournament 3, but that game runs at a perfectly playable framerate on my laptop, and the only thing its video card can beat nowadays is Intel's chips (though it can play a mean game of Doom 3 and FEAR).
Also, I really, really doubt that "driver bugs" are the issue here. Batman: Arkham Asylum is a particularly notorious example, as it locks out (or used to, at any rate) anti-aliasing on ATI cards. Needless to say, the game has a "Designed for nVidia" logo, and when anti-aliasing is hacked on, it works perfectly fine on ATI cards. Even if there were driver bugs, you'd think the developers would work around them, considering ATI has roughly 50% of the market share for discrete video chipsets. You know, the thing developers usually do when certain video card driver releases have bugs.
Don't even get me started on all of the PhysX bullshit. Between choosing legacy x87 floating-point instructions over SSE to cripple the CPU fallback path, limiting GPU acceleration to nVidia cards, and preventing people from using a hybrid configuration (i.e. an nVidia GPU for physics processing and an ATI GPU for rendering), I don't think I really need to say anything. I could understand nVidia limiting support to, well, nVidia cards, but the other two issues are just inexcusable.
The long and short of it: nVidia makes some rockin' hardware, but the company itself is worse than Microsoft in the late 90s. Fuck 'em.
...wow, that was quite a tangent.