Aven Donn
Aven Donn
The AMD 7750 is the fastest card on the market that does not require extra power connectors.


Because you know, saving the odd 60 watts is GREAT for a MASSIVE loss in performance.

In specs, the 550 Ti is superior. Why would you take the 7750 just because it consumes less power?
What are you talking about? They are pretty close in performance. It is a far more efficient card, and you have to factor the power bill into the cost too.


http://www.hwcompare.com/11741/geforce-gtx-550-ti-vs-radeon-hd-7750/

I beg to differ. And that's a pathetic excuse. I have to factor the power bill into the cost too? The power bill is not going to be drastically higher, and it's not like you could add more power to the 7750 to reach 550 Ti performance. Well, you could, by overclocking, but good luck. Overclocking it that much would just cause tons of problems.

Now just looking at the specs, I can see it's inferior: a 128-bit bus width vs. 192-bit. That's a massive difference. Fewer render output units as well... Hell, even the core clock is lower. The only department it really wins in is unified shaders, and you simply can't compare those between different companies, or even different card generations; only between cards of the same generation and company. (A unified shader from nVidia isn't the same as a unified shader from ATI.)

But please, correct me if I'm wrong. How much would the 60W difference cost per year, if the computer was running at full power for the entire year? A year has 8760 hours. How much does 60W cost per hour? If my quick math isn't mistaken, even at high electricity prices, it amounts to roughly 7 dollars.
Those bar charts don't say anything meaningful. How about you show some actual game benchmarks where there is a drastic difference? The only benchmarks I find show the 7750 being close to or trounced by the 560 Ti, but that is even a tier or two higher than the 550 Ti, which itself is only one tier over a 7750 at most.

My calculation says a constant 60 W draw, 24/7 for a year, is approx $50 at $0.10 per kWh. If OP lives in Hawaii she can be paying close to $0.40 per kWh, for a greater than $200 per year difference. Other places it can be a bit below $0.10 per kWh.
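
Spelled out (assuming the full 8760 hours of the year at a constant 60 W draw):

60 * 8760 = 525600 Wh = 525.6 kWh per year
525.6 kWh * $0.10 = $52.56
525.6 kWh * $0.40 = $210.24

which lines up with the roughly-$50 and greater-than-$200 figures above.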

Shameless Enabler

Aven Donn
The AMD 7750 is the fastest card on the market that does not require extra power connectors.


Because you know, saving the odd 60 watts is GREAT for a MASSIVE loss in performance.

In specs, the 550 Ti is superior. Why would you take the 7750 just because it consumes less power?
What are you talking about? They are pretty close in performance. It is a far more efficient card, and you have to factor the power bill into the cost too.
He probably does not pay the power bill at his house.
EDIT: TERRIBLE APOLOGY, I accidentally calculated using 560 Ti. I'm fixing it now.


Edit: I have fixed it, but there may be some leftovers.


Aven Donn
Aven Donn
The AMD 7750 is the fastest card on the market that does not require extra power connectors.


Because you know, saving the odd 60 watts is GREAT for a MASSIVE loss in performance.

In specs, the 550 Ti is superior. Why would you take the 7750 just because it consumes less power?
What are you talking about? They are pretty close in performance. It is a far more efficient card, and you have to factor the power bill into the cost too.


http://www.hwcompare.com/11741/geforce-gtx-550-ti-vs-radeon-hd-7750/

I beg to differ. And that's a pathetic excuse. I have to factor the power bill into the cost too? The power bill is not going to be drastically higher, and it's not like you could add more power to the 7750 to reach 550 Ti performance. Well, you could, by overclocking, but good luck. Overclocking it that much would just cause tons of problems.

Now just looking at the specs, I can see it's inferior: a 128-bit bus width vs. 192-bit. That's a massive difference. Fewer render output units as well... Hell, even the core clock is lower. The only department it really wins in is unified shaders, and you simply can't compare those between different companies, or even different card generations; only between cards of the same generation and company. (A unified shader from nVidia isn't the same as a unified shader from ATI.)

But please, correct me if I'm wrong. How much would the 60W difference cost per year, if the computer was running at full power for the entire year? A year has 8760 hours. How much does 60W cost per hour? If my quick math isn't mistaken, even at high electricity prices, it amounts to roughly 7 dollars.
Those bar charts don't say anything meaningful. How about you show some actual game benchmarks where there is a drastic difference? The only benchmarks I find show the 7750 being close to or trounced by the 560 Ti, but that is even a tier or two higher than the 550 Ti, which itself is only one tier over a 7750 at most.

My calculation says a constant 60 W draw, 24/7 for a year, is approx $50 at $0.10 per kWh. If OP lives in Hawaii she can be paying close to $0.40 per kWh, for a greater than $200 per year difference. Other places it can be a bit below $0.10 per kWh.


First, I'll double-check your math. A kWh is 1000 Wh, so we take the price of one kilowatt-hour and divide it by 1000 to get the price of one watt-hour. Then we multiply by 60 to get the cost of 60 watts for an hour, and multiply by 8760 to get a full year of full-power running. You peg your price at 10 cents per kWh. To ensure no decimal fumbling is done, we will count in cents, not dollars.

10/1000 = 0.01

0.01*60 = 0.6

0.6 * 8760 = 5256

5256/100 = $52.56 per year, at constant operation, at full power draw. So yes, you were close. I'll hand you that and trust your Hawaii math.
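
If it helps, here is the same formula as a tiny Python sketch, just to make the unit conversions explicit (the helper name is made up for illustration):

def annual_cost_dollars(watts, hours, cents_per_kwh):
    # watts * hours = watt-hours; / 1000 = kWh; * cents per kWh = cents; / 100 = dollars
    return watts * hours / 1000 * cents_per_kwh / 100

print(round(annual_cost_dollars(60, 8760, 10), 2))   # 52.56, matching the figure above
print(round(annual_cost_dollars(60, 8760, 40), 2))   # 210.24, the always-on Hawaii case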

So... Essentially the money you save, if you live in Hawaii, by buying a 7750 would easily cover a new GPU. However, let's try to get some real numbers now, assuming he turns off his computer every night (the occasional night left on to download is covered by the error margin, and the GPU doesn't really do anything while downloading overnight anyway).

So let's chop off 8 hours per day already. That's 2920 hours off, which leaves us with 5840 hours.

Further, it's not likely the computer will be used for gaming or high-power rendering tasks all day long, right? So let's chop off roughly 3 more hours per day to account for that. If you think this number is too big, please tell me why and I'll redo the math. I gotta admit I'm quite enjoying this.

So that leaves us at... 4745 hours per year.

Now, remember that GPUs rarely even reach their max TDP. But I may be wrong here, so we'll completely ignore this bit for our calculation.

What about non-power-intensive rendering? Let's ignore that as well and assume that all the time we have left is 100% power load. I'll use the Hawaii price of 40 cents per kWh.

That's 60 W for 4745 hours, or 284.7 kWh, so we're at $113.88 per year for Hawaii. Still considerable, I have to admit. Now, I'll go look for actual benchmarks.

Tom's Hardware benchmarks. For some reason they're not showing a high-quality benchmark, so we have to go with their medium-quality one.

[Tom's Hardware medium-quality benchmark charts]

So they come very, very close.

So we reached a difference of $113.88 per year in Hawaii. Let's look up the average price of electricity in the USA.

First off, in my search I didn't find a single place that said electricity in Hawaii is 40 cents per kilowatt-hour. I found 27 cents at worst. Residential sector, of course.

The average price, high estimate, seems to be around 20 cents. I didn't do exact math, but please feel free to. And please, cite your sources. I'm not citing mine because I'm not really willing to testify to their accuracy and because I didn't put much effort into looking them up. I'm assuming you have, so we'll go by your sources if they contradict mine.

Now let's calculate how much difference they make at 20 cents per kilowatt-hour...

We can just halve our $113.88 to get $56.94 per year. Double check the math if you don't believe it.

So... While in Hawaii this might be a bit higher... We gotta remember that the actual power difference per year is going to be a far smaller cost. What we did here is a conservative estimate, or better yet, a maximum. At $56.94 per year... I'd say it's worth it.
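
And to see the spread across the rates mentioned so far, here is the 4745-hour ceiling run through a quick Python loop (again just a sketch):

kwh_gap = 60 * 4745 / 1000   # 284.7 kWh per year for the 60 W difference
for cents_per_kwh in (10, 20, 27, 40):
    print(cents_per_kwh, round(kwh_gap * cents_per_kwh / 100, 2))
# prints: 10 28.47, 20 56.94, 27 76.87, 40 113.88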

Edit: And I looked up how much each card costs. The 7750 is significantly cheaper. So allow me to announce this properly.

You have won this argument fair and square: the 7750 wins as the most worthwhile card in terms of performance per dollar, both initially and over time. I salute you for proving me wrong.
nouveau sereph
Aven Donn
The AMD 7750 is the fastest card on the market that does not require extra power connectors.


Because you know, saving the odd 60 watts is GREAT for a MASSIVE loss in performance.

In specs, the 550 Ti is superior. Why would you take the 7750 just because it consumes less power?
What are you talking about? They are pretty close in performance. It is a far more efficient card, and you have to factor the power bill into the cost too.
He probably does not pay the power bill at his house.
We can't assume that, can we?
Aven Donn
Aven Donn
Aven Donn
The AMD 7750 is the fastest card on the market that does not require extra power connectors.


Because you know, saving the odd 60 watts is GREAT for a MASSIVE loss in performance.

In specs, the 550 Ti is superior. Why would you take the 7750 just because it consumes less power?
What are you talking about? They are pretty close in performance. It is a far more efficient card, and you have to factor the power bill into the cost too.


http://www.hwcompare.com/11741/geforce-gtx-550-ti-vs-radeon-hd-7750/

I beg to differ. And that's a pathetic excuse. I have to factor the power bill into the cost too? The power bill is not going to be drastically higher, and it's not like you could add more power to the 7750 to reach 550 Ti performance. Well, you could, by overclocking, but good luck. Overclocking it that much would just cause tons of problems.

Now just looking at the specs, I can see it's inferior: a 128-bit bus width vs. 192-bit. That's a massive difference. Fewer render output units as well... Hell, even the core clock is lower. The only department it really wins in is unified shaders, and you simply can't compare those between different companies, or even different card generations; only between cards of the same generation and company. (A unified shader from nVidia isn't the same as a unified shader from ATI.)

But please, correct me if I'm wrong. How much would the 60W difference cost per year, if the computer was running at full power for the entire year? A year has 8760 hours. How much does 60W cost per hour? If my quick math isn't mistaken, even at high electricity prices, it amounts to roughly 7 dollars.
Those bar charts don't say anything meaningful. How about you show some actual game benchmarks where there is a drastic difference? The only benchmarks I find show the 7750 being close to or trounced by the 560 Ti, but that is even a tier or two higher than the 550 Ti, which itself is only one tier over a 7750 at most.

My calculation says a constant 60 W draw, 24/7 for a year, is approx $50 at $0.10 per kWh. If OP lives in Hawaii she can be paying close to $0.40 per kWh, for a greater than $200 per year difference. Other places it can be a bit below $0.10 per kWh.


First, I'll double-check your math. A kWh is 1000 Wh, so we take the price of one kilowatt-hour and divide it by 1000 to get the price of one watt-hour. Then we multiply by 60 to get the cost of 60 watts for an hour, and multiply by 8760 to get a full year of full-power running. You peg your price at 10 cents per kWh. To ensure no decimal fumbling is done, we will count in cents, not dollars.

10/1000 = 0.01

0.01*60 = 0.6

0.6 * 8760 = 5256

5256/100 = $52.56 per year, at constant operation, at full power draw. So yes, you were close. I'll hand you that and trust your Hawaii math.

So... Essentially the money you save, if you live in Hawaii, by buying a 7750 would easily cover a new GPU. However, let's try to get some real numbers now, assuming he turns off his computer every night (the occasional night left on to download is covered by the error margin, and the GPU doesn't really do anything while downloading overnight anyway).

So let's chop off 8 hours per day already. That's 2920 hours off, which leaves us with 5840 hours.

Further, it's not likely the computer will be used for gaming or high-power rendering tasks all day long, right? So let's chop off roughly 3 more hours per day to account for that. If you think this number is too big, please tell me why and I'll redo the math. I gotta admit I'm quite enjoying this.

So that leaves us at... 4745 hours per year.

Now, remember that GPUs rarely even reach their max TDP. But I may be wrong here, so we'll completely ignore this bit for our calculation.

What about non-power-intensive rendering? Let's ignore that as well and assume that all the time we have left is 100% power load. I'll use the Hawaii price of 40 cents per kWh.

That's 60 W for 4745 hours, or 284.7 kWh, so we're at $113.88 per year for Hawaii. Still considerable, I have to admit. Now, I'll go look for actual benchmarks.

All benchmarks are at high quality settings, using DX11. (Or 10 when concerning cards that only support 10, but both our cards support 11)

The resolutions we'll compare at are 1920x1080 and 2560x1600.

For 560 Ti:
1920x1080: 53.77
2560x1600: 31.24

For 5770:
1920x1080: 34.10
2560x1600: 19.78

So for this game, the 560 Ti wins by a MASSIVE margin. The 5770 is still playable at those high settings, but the 560 Ti will easily survive future games while the 5770 will force you to lower graphical settings.
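
(Just to put a number on that margin: 53.77 / 34.10 ≈ 1.58 and 31.24 / 19.78 ≈ 1.58, so the quoted figures are roughly 58% higher at both resolutions.)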

I tried looking up other benchmarks using other games that include both these cards, but I haven't been having much luck there. You know why? Because they're wildly different cards with wildly different performance. I found plenty of benchmarks comparing to the 6850, which is a great card. But we can't use those, can we?

So we reached a difference of $113.88 per year in Hawaii. Let's look up the average price of electricity in the USA.

First off, in my search I didn't find a single place that said electricity in Hawaii is 40 cents per kilowatt-hour. I found 27 cents at worst. Residential sector, of course.

The average price, high estimate, seems to be around 20 cents. I didn't do exact math, but please feel free to. And please, cite your sources. I'm not citing mine because I'm not really willing to testify to their accuracy and because I didn't put much effort into looking them up. I'm assuming you have, so we'll go by your sources if they contradict mine.

Now let's calculate how much difference they make at 20 cents per kilowatt-hour...

We can just halve our $113.88 to get $56.94 per year. Double check the math if you don't believe it.

So... While in Hawaii this might be a bit higher... We gotta remember that the actual power difference per year is going to be a far smaller cost. What we did here is a conservative estimate, or better yet, a maximum. At $56.94 per year... I'd say it's worth it.
From what I am reading, you are confirming what I said earlier, then deciding that your own figure of 24/7 for a year is too much and cutting it down. I don't reject this, as it is likely to be more accurate, but it is also necessary to add in the power saved over the life of the card, as well as other ancillary things like the reduced need to cool/run the AC, and future price changes for electricity. If speculation about Iran continues to run rampant as it has been, you can almost guarantee a price hike sometime this year.

Also, my rates for Hawaii are from the Hawaii Electric Co http://www.heco.com/portal/site/heco/menuitem.508576f78baa14340b4c0610c510b1ca/?vgnextoid=692e5e658e0fc010VgnVCM1000008119fea9RCRD&vgnextchannel=10629349798b4110VgnVCM1000005c011bacRCRD&vgnextfmt=default&vgnextrefresh=1&level=0&ct=article

The average for residential is $0.36 per kWh, and that average hides the peak electricity prices, which are in effect during the hours the computer is most likely to be on.
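
(At that $0.36 average, the 4745-hour ceiling worked out above, 284.7 kWh of difference, comes to roughly 284.7 * 0.36 ≈ $102 a year, and the always-on 8760-hour figure to about $189.)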


You also didn't really dispute that the 560 Ti did exactly what I said it did, which is that the 7750 either was close or got trounced. I pointed out the 560 Ti is at least 2 tiers higher than the 7750, and likely 3 tiers higher.

As I stated earlier in the thread, the biggest drawback for the 7750 is going to be the shitty AMD drivers.
.


Check my post again. I edited it once I noticed my mistake. You'll be happy to find what's at the end of it.
Aven Donn
.


Check my post again. I edited it once I noticed my mistake. You'll be happy to find what's at the end of it.
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.
Aven Donn
.


Check my post again. I edited it once I noticed my mistake. You'll be happy to find what's at the end of it.
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.


Perhaps the OP should consider another store. The electricity difference, if the price is the same, is negligible enough. I'd pay that much to not deal with shitty AMD drivers.

The price difference between them is like $40, which is just over a whole year's worth of the electricity difference under realistic conditions.
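
(Rough payback check: against the $56.94-per-year ceiling calculated earlier, the $40 gap is covered in well under a year; under more realistic rates and usage it stretches to roughly a year or more, which is about where that estimate lands.)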
Aven Donn
Aven Donn
.


Check my post again. I edited it once I noticed my mistake. You'll be happy to find what's at the end of it.
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.


Perhaps the OP should consider another store. The electricity difference, if the price is the same, is negligible enough. I'd pay that much to not deal with shitty AMD drivers.

The price difference between them is like $40, which is just over a whole year's worth of the electricity difference under realistic conditions.
You only keep your graphics cards for a year?
Aven Donn
Aven Donn
.


Check my post again. I edited it once I noticed my mistake. You'll be happy to find what's at the end of it.
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.


Perhaps the OP should consider another store. The electricity difference, if the price is the same, is negligible enough. I'd pay that much to not deal with shitty AMD drivers.

The price difference between them is like $40, which is just over a whole year's worth of the electricity difference under realistic conditions.
You only keep your graphics cards for a year?


No no, you misunderstand. I just mentioned what the $40 difference between them would mean and how significant it actually is. If the OP is paying the same for them... Well, I'd go for the 550 Ti, because the difference we calculated is a high-end margin that assumes above-standard use. The real cost is far lower, especially if the OP pays even less than 20 cents per kilowatt-hour.

I plan to keep my 560Ti around for at least 3 years. But we'll see how much money I'll have to spend by then.
Aven Donn
Aven Donn
Aven Donn
.


Check my post again. I edited it once I noticed my mistake. You'll be happy to find what's at the end of it.
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.


Perhaps the OP should consider another store. The electricity difference, if the price is the same, is negligible enough. I'd pay that much to not deal with shitty AMD drivers.

The price difference between them is like $40, which is just over a whole year's worth of the electricity difference under realistic conditions.
You only keep your graphics cards for a year?


No no, you misunderstand. I just mentioned what the $40 difference between them would mean and how significant it actually is. If the OP is paying the same for them... Well, I'd go for the 550 Ti, because the difference we calculated is a high-end margin that assumes above-standard use. The real cost is far lower, especially if the OP pays even less than 20 cents per kilowatt-hour.

I plan to keep my 560Ti around for at least 3 years. But we'll see how much money I'll have to spend by then.
Lucky. I am stuck using a shitty HD4850 for the indefinite future.
Aven Donn
Aven Donn
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.


Perhaps the OP should consider another store. The electricity difference, if the price is the same, is negligible enough. I'd pay that much to not deal with shitty AMD drivers.

The price difference between them is like $40, which is just over a whole year's worth of the electricity difference under realistic conditions.
You only keep your graphics cards for a year?


No no, you misunderstand. I just mentioned what the $40 difference between them would mean and how significant it actually is. If the OP is paying the same for them... Well, I'd go for the 550 Ti, because the difference we calculated is a high-end margin that assumes above-standard use. The real cost is far lower, especially if the OP pays even less than 20 cents per kilowatt-hour.

I plan to keep my 560Ti around for at least 3 years. But we'll see how much money I'll have to spend by then.
Lucky. I am stuck using a shitty HD4850 for the indefinite future.


How the hell does that card cost so close to the 5770?
Aven Donn
Aven Donn
Aven Donn
Cool beans. However, at the OP's brick-and-mortar store they sell for the same price. My argument isn't that the 7750 is better graphically, but that the marginal cost is in its favor by saving electricity.


Perhaps the OP should consider another store. The electricity difference, if the price is the same, is negligible enough. I'd pay that much to not deal with shitty AMD drivers.

The price difference between them is like $40, which is just over a whole year's worth of the electricity difference under realistic conditions.
You only keep your graphics cards for a year?


No no, you misunderstand. I just mentioned what the $40 difference between them would mean and how significant it actually is. If the OP is paying the same for them... Well, I'd go for the 550 Ti, because the difference we calculated is a high-end margin that assumes above-standard use. The real cost is far lower, especially if the OP pays even less than 20 cents per kilowatt-hour.

I plan to keep my 560Ti around for at least 3 years. But we'll see how much money I'll have to spend by then.
Lucky. I am stuck using a shitty HD4850 for the indefinite future.


How the hell does that card cost so close to the 5770?
I don't know. It's what I got, and it seems I can't afford to replace it for a few more years.
