Unless this is the first page you’ve found (in which case, welcome and thanks for stopping by!), you’ve no doubt picked up on the fact that I’m building a PC with an Intel CPU and ATi graphics.
This is rather controversial, in some circles. There’s no doubt that the majority of ‘extreme gamers’ you run into online are medium-to-extreme fanbois of AMD and Nvidia. The zeal that many feel for these two companies’ products is backed up by some very real facts and statistics, too.
I don’t know that it’s worth arguing about, however. My choice is based on very real facts and statistics too…I just came to the opposite conclusion.
AMD (and to a lesser extent, Nvidia) tends to be less expensive and also tends to perform better in many tests. I’m not denying any of this.
On the other hand, my AMD CPUs and Nvidia GPUs also tend to fail much more often. I am not exaggerating when I tell you that I will go through three Nvidia cards for every one ATi card, and two AMD CPUs for every one Intel CPU. That’s a pretty major difference.
A big part of it is the heat. Modern chips run hotter than ever, and I live in a part of the world where it stays over 100°F for three to five months straight. AMD/Nvidia chips run hotter than Intel/ATi chips.
That’s one of the reasons why such a huge cooling industry has sprung up over the last decade or so (the other reason is overclocking, of course, which can make even a relatively cool PC turn into Mount Doom). So in my mind, the extra cost to achieve comparable operating temperatures pretty much negates any savings.