**EDIT: TERRIBLE APOLOGY, I accidentally calculated using 560 Ti. I'm fixing it now.**
Edit: I have fixed it, but there may be some leftovers.

29582351c3

Aven Donn

The AMD 7750 is the fastest card on the market that does not require extra power connectors.

Because you know, saving the odd 60 watts is GREAT for a MASSIVE loss in performance.

In specs, the 550 Ti is superior. Why would you take the 7750 just because it consumes less power?

What are you talking about? They are pretty close in performance. The 7750 is a far more efficient card, and you have to factor the power bill into the cost too.

http://www.hwcompare.com/11741/geforce-gtx-550-ti-vs-radeon-hd-7750/

I beg to differ. And that's a pathetic excuse. I have to factor the power bill into the cost? The power bill is not going to be drastically higher, and it's not like you could feed more power to the 7750 to reach 550 Ti performance. Well, you could, by overclocking, but good luck. Overclocking it that much would just cause tons of problems.

Now just looking at the specs, I can see it's inferior. A 128-bit bus width vs. 192-bit. That's a massive difference. Fewer render output units as well... Hell, even the core clock is lower. The only department it really wins in is unified shaders, and you simply can't compare those between different companies, or even different card generations. Only between cards of the same generation and company. (Because a unified shader from nVidia isn't the same as a unified shader from ATI.)

But please, correct me if I'm wrong. How much would the 60W difference cost per year, if the computer was running at full power for the entire year? A year has 8760 hours. How much does 60W cost per hour? If my quick math isn't mistaken, even at high electricity prices, it amounts to roughly 7 dollars.

Those bar charts don't say anything meaningful. How about you show some actual game benchmarks where there is a drastic difference. The only benchmarks I find show the 7750 being close to or trounced by the 560Ti, but that is even a tier or 2 higher than the 550Ti which itself is only 1 tier over a 7750 at most.

My calculation says 60W per hour 24/7 for a year is approx $50 at $0.10 per KWH. If OP lives in Hawaii she can be paying close to $0.40 per KWH for a greater than $200 per year difference. Other places it can be a bit below $0.10 per KWH.

First, I'll double-check your math. A kilowatt is 1000 watts, so we take the price of one kilowatt-hour and divide it by 1000 to get the price of one watt-hour. Then we multiply by 60 to get the cost of running 60 watts for an hour, and by 8760 to get a full year of full-power running. You peg your price at 10 cents per kilowatt-hour. To ensure no decimal fumbling is done, we will count in cents, not dollars.

10 ¢/kWh ÷ 1000 = 0.01 ¢/Wh

0.01 ¢/Wh × 60 W = 0.6 ¢/h

0.6 ¢/h × 8760 h = 5256 ¢

5256/100 = $52.56 per year, at constant operation, at full power draw. So yes, you were close. I'll hand you that and trust your Hawaii math.
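The cent-by-cent arithmetic above can be sketched in a few lines of Python (the wattage, hours, and rate are the thread's own figures, not measured values):

```python
# Yearly cost of an extra 60 W of draw, running 24/7 at full power.
EXTRA_WATTS = 60        # TDP gap between the two cards (thread's figure)
HOURS_PER_YEAR = 8760   # 24 * 365
CENTS_PER_KWH = 10      # assumed US rate of $0.10 per kWh

# cents/kWh -> cents/Wh, then scale by watts and hours
cents = CENTS_PER_KWH / 1000 * EXTRA_WATTS * HOURS_PER_YEAR
print(f"${cents / 100:.2f} per year")  # $52.56 per year
```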

So... Essentially, the money you save by buying a 7750, if you live in Hawaii, would easily cover a new GPU. However, let's try to get some real numbers now. Assume the OP turns off the computer every night. (Leaving it on overnight to download is covered by the error margin, and the GPU doesn't really do anything while downloading anyway.)

So let's chop off 8 hours per day already. That leaves us with 5840 hours.

Further, it's not likely the computer will be used for gaming or high-power rendering tasks all day long, right? So let's chop off roughly 3 more hours per day to account for that. If you think this number is too big, please tell me why and I'll redo the math. I gotta admit I'm quite enjoying this.

So that leaves us at... 4745 hours per year.

Now to remember that GPUs rarely even reach their max TDP. But I may be wrong here, so we'll completely ignore this bit for our calculation.

What about non-power-intensive rendering? Let's ignore that as well and assume all the remaining time is 100% power load. I'll use the Hawaii price of 40 cents per kilowatt-hour.

That's 4745 h × 60 W = 284.7 kWh, which at 40 cents per kWh comes to $113.88 per year for Hawaii. Still considerable, I have to admit. Now, I'll go look for actual benchmarks.

Tom's Hardware benchmarks. For some reason they're not showing a high-quality benchmark, so we have to go with their medium-quality benchmark.

So they come very, very close.

So we reached a difference of $113.88 per year in Hawaii. Let's look up the average price of electricity in the USA.

First off, in my search I didn't find a single place that said electricity in Hawaii is 40 cents per kilowatt-hour. I found 27 at worst. Residential sector, of course.

Average price, high estimate, seems to be around 20 cents. I didn't do exact math, but please feel free to. And please, cite your sources. I'm not citing mine because I'm not really willing to testify for their accuracy and because I didn't put much effort into looking them up. I'm assuming you have, so we'll go by your sources if they contradict mine.

Now let's calculate how much difference they make at 20 cents per kilowatt-hour...

We can just halve our $113.88 to get $56.94 per year. Double check the math if you don't believe it.
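The whole adjusted-hours estimate can be checked in one short Python sketch (8 overnight hours and 3 non-gaming hours chopped off per day, rates as assumed in this thread):

```python
EXTRA_WATTS = 60                   # TDP gap between the two cards
hours = 8760 - (8 + 3) * 365       # 4745 h of assumed full load per year
kwh = hours * EXTRA_WATTS / 1000   # 284.7 kWh of extra energy

for cents_per_kwh in (40, 20):     # Hawaii-ish rate vs. high-average US rate
    print(f"{cents_per_kwh} cents/kWh: ${kwh * cents_per_kwh / 100:.2f} per year")
```

The 20-cent line is exactly half the 40-cent line, matching the halving done above.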

So... While in Hawaii this might be a bit higher, we gotta remember that the actual power difference per year is going to cost far less than this. What we did here is called a conservative estimate. Or better yet, a maximum. At $56.94 per year... I'd say it's worth it.

Edit: And I looked up how much each card costs. The 7750 is significantly cheaper. So allow me to announce this properly.

**You have won this argument fair and square. The 7750 wins as the most worthwhile card in terms of performance per dollar, both initially and over time. I salute you for proving me wrong.**