oh crap, fanboy! rumor: DX12 will upgrade X1 GPU to 900MHz

#31 N3xtG3nGam3r Posted 8/17/2014 7:47:38 PM
OK, nobody responded to my post, so I will elaborate and entertain the thought that this could be a possibility.

Here's a link to the topic where it was explained how the API can reduce power consumption while keeping performance the same. In theory, I guess that means that if the power used is cut in half with no loss in performance, pushing the power back up to 100% would lead to a 50% increase in FPS. That isn't very realistic, and although it is "theoretical", "in practice" is all that matters; theoretical numbers rarely stand up to real-world performance.

http://www.developer-tech.com/news/2014/aug/13/directx-12-boosts-fps-50-and-cuts-power-consumption-half/
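
To make that arithmetic concrete, here's a rough sketch of the article's two headline claims (hypothetical numbers; it takes the marketing figures at face value and says nothing about how power actually scales with clocks and voltage):

# Sketch of the article's two headline claims (hypothetical numbers;
# the 50% FPS and half-power figures are the article's own marketing claims).
baseline_fps = 30.0      # arbitrary DX11 baseline frame rate
baseline_power = 100.0   # arbitrary DX11 power draw, in % of budget

# Claim 1: DX12 delivers the same performance at half the power.
efficient_fps = baseline_fps            # still 30 FPS
efficient_power = baseline_power * 0.5  # 50% of the budget

# Claim 2: spend the full power budget and take the FPS gain instead.
full_power_fps = baseline_fps * 1.5     # 45 FPS, per the headline
full_power = baseline_power             # back to 100% of the budget

print(f"half-power mode: {efficient_fps:.0f} FPS at {efficient_power:.0f}% power")
print(f"full-power mode: {full_power_fps:.0f} FPS at {full_power:.0f}% power")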

Now, assuming they are able to reduce power consumption by 50% while keeping performance the same, that would "theoretically" mean they could boost the clocks on the GPU and still stay within the same temperature threshold as before.

In this scenario, yes, they could definitely boost the clocks and "OC" the GPU safely.

Now, if this is all true, I wouldn't expect to see only a 50MHz boost. I would say they could do a 10-15% boost (which translates to 80-120MHz) and be perfectly fine temperature-wise. Will they go that high? Considering that we are working with a console, where everybody has the same setup, and they need to make sure that every console out there will run those clocks, and run them for the next 7-8 years, it is possible that they would only boost it 50MHz.
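
For reference, the back-of-the-envelope math behind those figures (a sketch only; 800MHz is the X1 GPU's widely reported pre-upclock figure, and the shipping clock is 853MHz, so the real numbers land slightly higher):

# Clock-boost math (sketch; 800MHz is the pre-upclock figure,
# the shipping X1 GPU clock is 853MHz).
base_clock_mhz = 800.0

for pct in (10, 15):
    boost = base_clock_mhz * pct / 100.0
    print(f"{pct}% boost = +{boost:.0f}MHz -> {base_clock_mhz + boost:.0f}MHz")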

OK, now... what exactly does 50MHz do? Not much. You are looking at maybe a 2-3 FPS boost, generally speaking. On a PC GPU with a clock of 850MHz, OC'ing to 900MHz would only yield a few FPS. It might be different on consoles, since the API is specific to the hardware, but I would say that's probably what you'd be looking at.
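
Assuming frame rate scales linearly with GPU clock (a best-case assumption; real games are rarely purely clock-bound), a 50MHz bump works out like this:

# Best-case FPS impact of a +50MHz bump, assuming frame rate scales
# linearly with GPU clock (real games rarely scale that cleanly).
old_clock, new_clock = 853.0, 903.0  # shipping clock + rumored 50MHz
scale = new_clock / old_clock        # ~1.059, a ~5.9% uplift

for fps in (30.0, 60.0):
    print(f"{fps:.0f} FPS -> {fps * scale:.1f} FPS (+{fps * (scale - 1):.1f})")

That prints roughly +1.8 FPS at 30 and +3.5 FPS at 60, which lines up with the 2-3 FPS ballpark at typical console frame rates.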

Taken together with the original ~50MHz upclock and the ~10% of GPU time freed up by dropping the mandatory Kinect reservation, this additional 50MHz boost (if true) would be fairly impressive from a console perspective. Typically things like this never happen with consoles; API updates, firmware, and OS tweaks are normally all a console gets. I think the reason MS was able to boost the clock initially was that they designed the console to be turned on once and left on for 10 years. Now that their "always on" plan is out the window, they have a little more leeway in how hard the user can be expected to push the console.
---
ASUS P8H61-M (Rev 3.0) | Intel Core i3 2100 | 8GB Dual-Channel DDR3 | 500GB HDD | 600W PSU | NVIDIA GTX 770 4GB GDDR5
#32 cory1225 Posted 8/17/2014 7:55:53 PM
RCW29 posted...
cory1225 posted...
But almost none of the exclusives are as fun as the ones on Xbox. Truth.


This is the very definition of an opinion, fanboy. I don't care what you own or claim to own, if you have that opinion, you are a fanboy. Truth.


I'm not a fanboy hahaha, I love both consoles equally; I even have a gaming PC to mess around with. I just feel like MS exclusives are more arcadey and fun, imo.
---
https://www.youtube.com/watch?v=BKJPwk_8mJU
Gt= oH BluRRie Psn = oH_BluRRie
#33 Xeeh_Bitz Posted 8/17/2014 8:02:07 PM
So a 10% boost to the GPU: instead of 30 FPS, it's 33 FPS.
---
3770K | 780 Ti x 2
Steam: Xeeh Origin: TurboPeasant
#34 NeoMonk Posted 8/17/2014 8:05:29 PM
Xeeh_Bitz posted...
So 10% boost to the GPU, instead of 30 fps, it's 33 fps


Umm... haven't you heard?
A 10% boost means 1080p & 60 FPS steady on all games!
---
"The Xbox One board isn't the place for personal anecdotes, joke topics or fanboy affair." Gamefaqs Moderator
#35 cory1225 Posted 8/17/2014 8:12:07 PM (edited)
Apex-Player posted...
Cory1225
Lol, you played through Infamous SS twice for a platinum. Couldn't be that bad.
Also at least 2 playthroughs (if you glitched) for the TLoU platinum.
Not to mention like 150 rounds of TLoU multiplayer.

Have you 1000/1000'd all the Xbox One games? They were fun enough, right?

You also buy the multiplat titles on PS4, like MSS and UFC, so I'll say good on you for that.


I never said the PS4 exclusives were bad, lol.
I loved TLoU; it's a great game.
Both systems are great and each system has its perks.
For instance, the PS4 party system is closer to the 360's and does a better job than XB1's party chat. On the other hand, XB1 has a better friends list interface, imo. Either way, people are shorting themselves by not owning both down the line.
---
https://www.youtube.com/watch?v=BKJPwk_8mJU
Gt= oH BluRRie Psn = oH_BluRRie
#36 LaManoNeraII Posted 8/17/2014 8:11:40 PM
OpheliaAdenade posted...
You can't get blood out of a turnip. You can overclock that GPU all you like, it isn't going to turn it into a nicer GPU.


It would be enough to bump 900p to 1080p to get that extra useless bullet point
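
For what it's worth, the pixel math makes that a big ask (rough sketch; assumes the cost of a frame scales roughly linearly with pixels rendered):

# 900p vs 1080p pixel counts (rough sketch; assumes frame cost scales
# roughly linearly with pixels rendered).
pixels_900p = 1600 * 900     # 1,440,000
pixels_1080p = 1920 * 1080   # 2,073,600

extra = pixels_1080p / pixels_900p - 1
print(f"1080p renders {extra:.0%} more pixels than 900p")  # 44%
# A ~6% clock bump (853MHz to ~900MHz) covers nowhere near a 44% jump.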
---
R.I.P LaManoNera
04-06-2009
#37 Apex-Player Posted 8/17/2014 8:17:11 PM
cory1225 posted...
Apex-Player posted...
Cory1225
Lol, you played through Infamous SS twice for a platinum. Couldn't be that bad.
Also at least 2 playthroughs (if you glitched) for the TLoU platinum.
Not to mention like 150 rounds of TLoU multiplayer.

Have you 1000/1000'd all the Xbox One games? They were fun enough, right?

You also buy the multiplat titles on PS4, like MSS and UFC, so I'll say good on you for that.


I never said the PS4 exclusives were bad, lol.
I loved TLoU; it's a great game.
Both systems are great and each system has its perks.
For instance, the PS4 party system is closer to the 360's and does a better job than XB1's party chat. On the other hand, XB1 has a better friends list interface, imo. Either way, people are shorting themselves by not owning both down the line.


I was mostly just curious whether you 1000/1000'd all the Xbox One games you preferred.

I can see the PS4 games you platinumed must have been fun enough to keep you hooked until completion.

Did the Xbox One games hook you until 1000/1000 as well?
---
Unbearable is it? The suffering of strangers, the agony of friends.
There's a secret song at the center of the world and its sound is like razors through flesh
#38 RCW29 Posted 8/17/2014 8:27:38 PM
LaManoNeraII posted...
OpheliaAdenade posted...
You can't get blood out of a turnip. You can overclock that GPU all you like, it isn't going to turn it into a nicer GPU.


It would be enough to bump 900p to 1080p to get that extra useless bullet point


Even if it were possible, which it isn't (and it's laughable), it would run at 1080p for about 10 minutes, if at all, and then the GPU would catch fire. If MS were stupid enough to try to overclock the GPU by 900MHz, they would have a system meltdown problem that would make the RROD seem like a mild nuisance.
#39 SculptorOvFlesh Posted 8/17/2014 8:31:21 PM
RCW29 posted...
LaManoNeraII posted...
OpheliaAdenade posted...
You can't get blood out of a turnip. You can overclock that GPU all you like, it isn't going to turn it into a nicer GPU.


It would be enough to bump 900p to 1080p to get that extra useless bullet point


Even if it were possible, which it isn't (and it's laughable), it would run at 1080p for about 10 minutes, if at all, and then the GPU would catch fire. If MS were stupid enough to try to overclock the GPU by 900MHz, they would have a system meltdown problem that would make the RROD seem like a mild nuisance.


Bumping it 900MHz is just plain stupid. Go bump your GPU by 900MHz and let me know how that works out.
50-200MHz is more than acceptable.
#40 RCW29 Posted 8/17/2014 8:38:36 PM
SculptorOvFlesh posted...
RCW29 posted...
LaManoNeraII posted...
OpheliaAdenade posted...
You can't get blood out of a turnip. You can overclock that GPU all you like, it isn't going to turn it into a nicer GPU.


It would be enough to bump 900p to 1080p to get that extra useless bullet point


Even if it were possible, which it isn't (and it's laughable), it would run at 1080p for about 10 minutes, if at all, and then the GPU would catch fire. If MS were stupid enough to try to overclock the GPU by 900MHz, they would have a system meltdown problem that would make the RROD seem like a mild nuisance.


Bumping it 900MHz is just plain stupid. Go bump your GPU by 900MHz and let me know how that works out.
50-200MHz is more than acceptable.


You don't have to tell me that; tell it to the TC and whatever nimrod made up that ridiculous "rumor". Besides, that isn't even what DirectX does.