Why is it so difficult to make a console that does 1080p and 120fps?

#51FoppePosted 9/2/2014 3:20:51 AM
All consoles this generation are powerful enough to do 1080p120.
The point is, people are so ignorant these days that they wouldn't accept the graphical downgrade required to make the game run that smoothly.
GameFAQs isn't going to be merged in with GameSpot or any other site. We're not going to strip out the soul of the site. -CJayC
#52DojoMaxPosted 9/2/2014 3:33:01 AM
Dieinafire1 posted...
Sith Jedi, looks like those two games are tied based on Digital Foundry. I wanted to like both of them when I played, but they are C/C+ games even on PC.

But didn't you say 'every' game?
https://www.youtube.com/watch?v=cVN_Ytl_bgA -- Titanfall is better on 360 than XBone.
#53MetroidFan9999Posted 9/2/2014 7:02:26 AM
a687947 posted...
It can be done with the current and even last-gen consoles. There would be huge sacrifices, though, to reach 120fps at 1080p.

This. Why do people continue making these dumb topics?

There is nothing preventing developers from doing this; people would just be wondering why the games look last-gen.
For a good laugh - http://misterxmedia.livejournal.com/
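To put rough numbers on the "huge sacrifices" point, here is a back-of-the-envelope sketch. It assumes rendering cost scales linearly with pixels drawn per second, which real renderers only approximate, so treat the ratios as illustrative:

```python
# Rough frame-budget arithmetic: pixels per second an idealized GPU
# must shade at various resolution/framerate combinations.
# Assumes cost scales linearly with pixel throughput (a simplification).

def pixels_per_second(width, height, fps):
    return width * height * fps

baseline = pixels_per_second(1920, 1080, 30)   # a common console target
target   = pixels_per_second(1920, 1080, 120)  # the topic's ask

print(target / baseline)   # 4.0 -- four times the shading work
print(1000 / 120)          # ~8.33 ms to render each frame at 120fps
print(1000 / 30)           # ~33.33 ms per frame at 30fps
```

Quadrupling the per-second workload while cutting the per-frame time budget to a quarter is exactly where the "games would look last-gen" tradeoff comes from.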
#54Dieinafire1Posted 9/2/2014 1:10:27 PM
DojoMax, yes, I didn't say every game, and I still haven't seen a game that is clearly better on Xbox One. Yet some games are leaps and bounds better on the PS4.
| AMD FX-8350 @ 4.4 ghz | MSI 990FXA-GD65 V2 | EVGA SLI TITANS | 16GB 1866 DDR3 | 850W Corsair | Win8.1 64bit | 3TB HDD 256 SSD | Hyper 212 Evo | 4K | PS4|Xbox1
#55SolisPosted 9/3/2014 1:30:34 AM
known2FAIL posted...
Solis posted...
Nearly any console released in the last decade could run at 1080p and 120fps (Wii notwithstanding). However, since none of them support higher than 60hz output on account of typical HDTV limitations, it would be pointless for them to run at that framerate, so 60FPS is pretty much the highest you'll ever see any of them run at. It should be possible for a firmware update to allow 1080p at 120hz, but since there's minimal consumer demand for it, I doubt we'll ever see it.

Cost doesn't have anything to do with it, though. There are no physical limitations that would prevent running 1080p at 120Hz; it's purely an issue of expectations and firmware restrictions.

Interesting. Now I'm wondering... even live TV is shown at some FPS, correct? Why are TV developers concentrating on increasing picture quality instead of increasing the Hz?

Live TV shows can be 24, 30, or 60FPS, but no show that I know of was recorded or mastered at 120FPS. Movies, of course, typically run at 24FPS, but some more recent productions (most notably The Hobbit) and IMAX movies run at 48FPS. With actual filmed material, it becomes much more expensive to record and store at higher framerates, and not many people would find much benefit in video that runs at 120FPS, so there's likely very little interest in the industry in going above 60Hz for filming.
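A back-of-the-envelope sketch of why higher-framerate filming is expensive to store. It assumes raw, uncompressed 1080p frames at 24-bit color, which is purely illustrative (real productions shoot at higher resolutions and use compressed codecs), but the linear scaling with framerate holds regardless:

```python
# Back-of-the-envelope storage cost for raw footage at different framerates.
# Assumes uncompressed 1080p at 3 bytes per pixel (24-bit RGB) -- purely
# illustrative; real cameras use higher resolutions and compressed codecs.

BYTES_PER_FRAME = 1920 * 1080 * 3  # ~6.2 MB per raw frame

def gb_per_minute(fps):
    return BYTES_PER_FRAME * fps * 60 / 1e9

for fps in (24, 48, 120):
    # storage scales linearly with framerate: 48fps doubles 24fps, etc.
    print(f"{fps:3d} fps: {gb_per_minute(fps):6.1f} GB per minute of raw video")
```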

Games and computer applications in general, on the other hand, can benefit from higher framerates: the artifacts that hide lower framerates in filmed or pre-rendered material (most notably motion blur) are rarely used to full effect in games, and the framerate directly affects the response time of actions, so in that case it can actually have a benefit. Just as 24FPS might seem fine for movies, it would be absolutely awful for something as simple as a mouse-driven UI.
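The responsiveness point can be made concrete with frame intervals. This sketch ignores input-pipeline and display latency, which add on top of the interval shown:

```python
# Frame interval: the minimum time between visual updates at a given
# framerate. Input-to-screen latency is at least on this order
# (plus pipeline/display delays, which this sketch ignores).

def frame_interval_ms(fps):
    return 1000.0 / fps

for fps in (24, 60, 120):
    print(f"{fps:3d} fps -> {frame_interval_ms(fps):5.2f} ms between frames")
```

At 24fps a mouse cursor updates only every ~42ms, which is why film framerates feel sluggish for interactive use while 120fps halves the interval of the usual 60fps cap.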
"Walking tanks must exist somewhere for there to be such attention to detail like this in mech sim." - IGN Steel Battalion review
#56chrcolPosted 9/3/2014 5:40:23 AM
The question is: why do you need 1080p and 120fps?

The answer to your question is cost.