Apparently SLI/Crossfire Is Dead. So What Do Game Devs Use to Render Their "In Game" Trailers That The Game Never Matches?
Are they just using offline rendering, or are they able to pair multiple RTX 2080 Tis together or something?
Comments
Like you guessed, a lot of the stuff you see in trailers is already pre-rendered. It's a process called 'baking', which, put simply, is pre-computing all the lighting effects, textures, displacement, etc., so that when you record the footage, all the effects are already in place and don't need to be rendered in real time.
Lighting effects are computationally expensive, so most games still use a combination of baked and dynamic sources in actual gameplay. If you look at the Unity engine, for instance, there are tools for baking light sources and shadows into the map itself.
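To make the idea concrete, here's a toy sketch in Python. It has nothing to do with Unity's actual API; it just illustrates the difference between paying the lighting cost once offline (the "bake") versus every frame:

```python
# Toy illustration of baked vs dynamic lighting (not any engine's real API).
# The "bake" precomputes a lightmap once, so the per-frame path is only a lookup.

# Hypothetical static scene: a flat 8x8 floor lit by one fixed point light.
LIGHT_POS = (4.0, 4.0, 3.0)   # x, y, and height above the floor
FLOOR_SIZE = 8

def compute_lighting(x, y):
    """Simple inverse-square falloff from the light to floor texel (x, y)."""
    dx, dy, dz = LIGHT_POS[0] - x, LIGHT_POS[1] - y, LIGHT_POS[2]
    dist_sq = dx * dx + dy * dy + dz * dz
    return min(1.0, 10.0 / dist_sq)

# "Baking": run the expensive lighting math once, offline, and store the result.
lightmap = [[compute_lighting(x, y) for x in range(FLOOR_SIZE)]
            for y in range(FLOOR_SIZE)]

def shade_baked(x, y):
    # At runtime a baked texel is just a lookup -- effectively free.
    return lightmap[y][x]

def shade_dynamic(x, y):
    # A dynamic light re-runs the math every frame for every texel,
    # which is why fully dynamic lighting is so much more expensive.
    return compute_lighting(x, y)

if __name__ == "__main__":
    # Both paths give the same result for a static light; only the cost differs.
    print(shade_baked(2, 3), shade_dynamic(2, 3))
```

Obviously a real bake stores full lightmap textures, bounce lighting, shadows and so on, but the trade-off is the same: you can only bake things that don't move.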
So I remember when they used to do live demos on stage at conventions for some past games, and it would turn out the game was running in real time on something like a GTX 1080 SLI setup. Is there actually any way for them to use some trickery in live demos to get more performance than the current king of the cards, the RTX 2080 Ti?
Oh you mean actual gameplay trailers! I think everyone thought you meant cinematic trailers.
I guess I was thinking of both but didn't make that very clear.
So with the live gameplay stuff, where it can't be pre-recorded, they can't do it with anything other than what us (rich) peasants could get, like an RTX 2080 Ti and a 9900K or something?

@lordezekiel: I can't really speak for the actual live-demo type presentations where someone does a hands-on run-through of a section of the game.
However, there have been plenty of instances where "gameplay" trailers turned out to be (or are suspected to be) pre-rendered animations that simply look like gameplay videos, e.g. Anthem and Aliens: Colonial Marines.
It's possible that in many of these instances (I'm thinking of some notable downgrades like Watch Dogs and so on) they did what you're saying, i.e. demoed the games on insane hardware just to achieve a playable framerate for the presentation.
Yeah, and I guess a single $1300+ card is still just as out of reach for 99.9% of people as an older high-end SLI setup would have been a few years back.
@lordezekiel: That's true, especially since card prices have gone up a lot, so one top-tier card now probably costs as much as two did back in the day. Of course, if you're going to demo a game on PC hardware, it makes sense to go with top-tier stuff to show your game in the best light possible, even if it still runs well on mid-range and budget gear.
One thing they might be able to do is optimize performance a bit more for the actual demo hardware, since they can choose that, in the same way that platform-exclusive games tend to look a fair bit better because developers only have to target one platform and can code closer to the metal.

@Deditus: You know, that's a good point about them possibly being able to tailor the performance of the live demo around a specific hardware setup. I guess I hadn't thought of that side of things.
@lordezekiel: Some of those live demos were shown running on a beefy gaming PC, whereas the target platform was a much cheaper/weaker console. The famous examples (of over-promising) are Watch Dogs and No Man's Sky.
@Kangal: That's right - I read that they believed if the console versions didn't look as good as the PC one, the game wouldn't sell well on console, so they "forced parity" by downgrading the PC version.
Those E3 stage demo gaming rigs are not your typical gaming rigs. They are most likely using the very best GPUs, sometimes a Titan RTX if they really want to flex just how good their games can look with real-time ray tracing turned on.
Generally speaking, unless it's a developer-focused demo where they're showcasing a new game engine, for example, they never talk about what kind of hardware the demo is being run on.
Render at 1 fps, then fast-forward to 60 fps later? Like slow-mo but in reverse.
Haha, I like that idea :P
Like other commenters have said, they're not rendered in real time. In a game everything is happening live, so frames need to be rendered on the fly (ideally you want to be pushing out 60+ frames per second), but for an animation you can take as long as you like to generate a single frame. I read that for the Pixar movie Monsters University, a single frame took around 29 hours to render, with a farm of about 2,000 computers behind the whole film.
You're right, but I think your numbers might be off. If it's 29 hours per frame, then for an hour-long movie at 24 fps you're looking at 286 years to render.
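For anyone curious, here's the back-of-envelope arithmetic in Python. The 29-hours-per-frame and 2,000-machine figures are just the ones quoted above, and the parallel estimate assumes frames can be farmed out independently across the render farm, which is a simplification:

```python
# Back-of-envelope check of the render-time numbers quoted above.
# Assumes 29 hours per frame, a 1-hour movie at 24 fps, and a 2,000-machine
# farm where frames render independently in parallel (a rough model).

HOURS_PER_FRAME = 29
FRAMES = 1 * 3600 * 24                     # 1 hour of footage at 24 fps = 86,400 frames
MACHINES = 2000

total_hours = FRAMES * HOURS_PER_FRAME     # ~2.5 million machine-hours of rendering
sequential_years = total_hours / (24 * 365)    # one machine doing every frame in sequence
parallel_days = total_hours / MACHINES / 24    # frames spread evenly across the farm

print(f"{sequential_years:.0f} years on a single machine")   # ~286 years
print(f"{parallel_days:.0f} days across {MACHINES} machines")  # ~52 days
```

So the 286-years figure only applies if one machine renders every frame back to back; spread across a couple of thousand machines it comes down to the order of weeks, which is roughly how studios actually get films out the door.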
They're not rendered in real time; i.e., they're just animations.