
Why are pre-rendered cutscenes used in 2013?

#1ThePCElitistPosted 11/1/2013 7:56:26 AM
They always look atrocious compared to the actual in-game graphics. After playing BF4 on ultra settings, it's incredibly noticeable when it transitions to a pre-rendered cutscene running at 720p and 30 fps. The graphics are good enough that they could have used the in-game engine for the cutscenes.
---
When I'm Miqo'te
http://www.youtube.com/watch?v=h3LGf9SSWrU
#2Knighted DragonPosted 11/1/2013 7:57:50 AM
It's only a problem on console ports from what I've seen, but it is pretty funny when a game looks ten times worse in the cutscene than in the game.
---
Matthew 7:21
http://img15.imageshack.us/img15/2492/gyenyame.jpg
#3WinpusPosted 11/1/2013 7:59:25 AM
I was playing XCOM the other day and wondering the same thing myself. The game where it was most jarring for me was Arkham Asylum, but it's the same effect you describe in BF4. Those pre-rendered scenes also take up way more storage space.

The only reason I can think of is that consoles can't handle the scenes at a consistent frame rate, or at all, and they package the pre-rendered version with all platforms just for simplicity.
#4LostHisHardcorePosted 11/1/2013 8:03:13 AM
Wut? BF4 uses cutscenes that have a lower resolution than the actual game? I thought that only happened with HD ports from the PS2 era
---
"What we've got here is a failure to communicate!"
"I'm you, I'm your shadow!"
#5ThePCElitist(Topic Creator)Posted 11/1/2013 8:10:46 AM
LostHisHardcore posted...
Wut? BF4 uses cutscenes that have a lower resolution than the actual game? I thought that only happened with HD ports from the PS2 era


It still happens quite a bit. The cutscenes in all the Mass Effect games were terrible as well.
---
When I'm Miqo'te
http://www.youtube.com/watch?v=h3LGf9SSWrU
#6SmakkyofacePosted 11/1/2013 8:21:31 AM
Batman Arkham Origins and BF4 have them and it makes me sad.
---
Xbox Live / PSN / Steam: Smakkyoface
i7 3770k @ 4.66ghz | 48GB DDR3 1866 (40GB RamDisk) | SLI Evga Gtx 670 FTW | Samsung 256GB PRO | 1TB Maxtor 7200rpm RAID 0
#7Knighted DragonPosted 11/1/2013 8:50:37 AM
Alan Wake was pretty bad about this too. When the cutscene ends, everything jumps back into focus and looks a hundred times better.
---
Matthew 7:21
http://img15.imageshack.us/img15/2492/gyenyame.jpg
#8El_KazPosted 11/1/2013 8:50:54 AM
If a cutscene would require creating a bunch of new models/objects/textures and animating them, it's much cheaper to pre-render it.
If it only uses assets that are already in the game... then yeah, it's silly.
---
Wait... what?
#9animanganimePosted 11/1/2013 9:10:47 AM
Because a lot of the time, like in BF4 for example, they're loading screens in disguise. And sometimes it's easier to do a pre-rendered cutscene if the scene switches locations quickly. If the scene takes place in only one location, and the assets for that place are already loaded or will be used for the level afterward anyway, then it will be real time.
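To illustrate the "loading screen in disguise" point: the engine can start streaming the next level's assets on a background thread the moment the video begins, so the load finishes behind the playback. A toy sketch in Python, with sleeps standing in for video playback and disk I/O (all names and timings here are made up, not from any real engine):

```python
import threading
import time

def play_cutscene(duration_s):
    # Stand-in for video playback; a real engine would decode frames here.
    time.sleep(duration_s)

def load_level_assets(assets, loaded):
    # Stand-in for streaming textures/models off disk in the background.
    for name in assets:
        time.sleep(0.01)  # simulated disk read per asset
        loaded.append(name)

loaded = []
loader = threading.Thread(target=load_level_assets,
                          args=(["terrain", "props", "enemies"], loaded))
loader.start()
play_cutscene(0.1)  # the video masks the load time
loader.join()       # by the time it ends, the level is ready
print(loaded)
```

As long as the video runs longer than the load, the player never sees a loading bar at all.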
---
WC i5 2500k @ 4.4 ghz, 2x GTX 670 DirectCU II, Samsung 830 128GB SSD, 16GB G.Skill 1600 DDR3, Zalman 1000W PSU, Cosmos 1000 Case, HP LP3065 @ 2560 x 1600 IPS
#10PhilOnDezPosted 11/1/2013 9:13:34 AM
A big reason for it is that when a scene involves a lot of different locations, it isn't efficient to try to render all of that in real time; it would take a lot of RAM to have every location loaded ahead of time, and RAM is the biggest limiting factor on the current consoles. With most (all?) games being 32-bit, it's probably a factor in the PC versions too, though not to the extent that it is on the consoles.
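The 32-bit argument above comes down to simple arithmetic. A rough sketch, assuming the usual 2 GiB user-space limit for a 32-bit Windows process and a hypothetical per-location asset budget (the 256 MB figure is made up for illustration):

```python
# 32-bit pointers can address 2^32 bytes = 4 GiB total.
address_space = 2 ** 32
print(address_space // 2 ** 30)  # GiB of addressable memory

# A 32-bit Windows process typically gets only 2 GiB of that for itself.
usable = 2 * 2 ** 30

# With a hypothetical 256 MB of assets per location, that caps how many
# fully-loaded locations a real-time multi-location scene could hold.
budget_per_location = 256 * 2 ** 20
print(usable // budget_per_location)  # max locations resident at once
```

So a cutscene that hops across many large environments can simply exceed what a 32-bit process can keep resident, which makes pre-rendering the cheap way out.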
---
Every time I try to go where I really wanna be it's already where I am, 'cuz I'm already there
XBL, PSN, Steam, Origin, BSN, GFAQs, MC: PhilOnDez