• JakJak98@lemmy.world · 4↑ · 1 day ago

      I feel like bloom depends on how intense it is, and if it makes sense to reasonably play the game.

      Like, if it’s the sun, yeah, bloom is OK.

      If it’s anything else? Pass.

  • Yaarmehearty@lemmy.ml · 7↑ · 1 day ago

    The preference against DOF is fine. However, I’m looking at my f/0.95 and f/1.4 lenses and wondering why it’s kind of prized in photography for some genres and hated in games?

    • ne0phyte@feddit.org · 14↑ · 1 day ago

      It is unnatural. The focus follows where you are looking. Having that fixed to the mouse/center of the screen instead of following what my eyes are doing feels so wrong to me.

      I bet with good eye tracking it would feel different.

      • Yaarmehearty@lemmy.ml · 7↑ · 1 day ago

        That makes sense, if you can’t dynamically control what is in focus then it’s taking a lot of control away from the player.

        I can also see why a dev would want to use it for a fixed angle cutscene to create subject separation and pull attention in the scene though.

  • Baguette@lemm.ee · 29↑ 2↓ · 2 days ago

    Depth of field and chromatic aberration are pretty cool if done right.

    Depth of field is a really important framing tool in photography and film, and the same applies to games. If your game has cinematics/cutscenes, they probably use depth of field in some way. Action and dialogue scenes usually emphasize the characters, where a shallow depth of field can pull focus onto just them. Meanwhile, moments like discovering a new region put the emphasis on the landscape, where a deep depth of field (essentially no background blur) works better.

    Chromatic aberration is cool if done right. It gives things a slightly out-of-place feel, which suits certain games and not others. Signalis and Dredge are a few games where chromatic aberration adds to the art style, imo. Though obviously, if it hurts your eyes, the games play just fine without it.
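    For anyone curious what the effect actually does, here's a toy sketch. Real games do this per pixel in a fragment shader, usually with an offset that grows toward the screen edges; the function name and the simple horizontal shift here are made up for illustration.

```python
# Toy chromatic aberration: sample red and blue at horizontally offset
# positions while green stays put, approximating lens colour fringing.

def chromatic_aberration(image, shift=1):
    """image: list of rows of (r, g, b) tuples; returns a fringed copy."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            r = image[y][min(w - 1, x + shift)][0]  # red sampled from the right
            g = image[y][x][1]                      # green unchanged
            b = image[y][max(0, x - shift)][2]      # blue sampled from the left
            row.append((r, g, b))
        out.append(row)
    return out

# A single white pixel on black picks up a red fringe on one side
# and a blue fringe on the other.
img = [[(0, 0, 0), (255, 255, 255), (0, 0, 0)]]
fringed = chromatic_aberration(img)
```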

    • justastranger@sh.itjust.works · 7↑ · 1 day ago

      Chromatic aberration is also one of the few effects that actually happens with our eyes instead of being an effect designed to replicate a camera sensor.

    • ysjet@lemmy.world · 8↑ · 2 days ago

      I feel like depth of field and motion blur have their place, yeah. I worked on a horror game one time, and we used a dynamic depth of field- anything you were looking at was in focus, but things nearer/farther than that were slightly blurred out, and when you moved where you were looking, it would take a moment (less than half a second) to ‘refocus’ if it was a different distance from the previous thing. Combined with light motion blur, it created a very subtle effect that ratcheted up anxiety when poking around. When combined with objects in the game being capable of casting non-euclidean shadows for things you aren’t looking at, it created a very pervasive unsettling feeling.
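      Roughly, the refocus logic works like this (a from-memory sketch in Python, not the game's actual code; the function name and constants are invented):

```python
import math

# Sketch of the delayed-refocus behaviour described above: the camera's
# focal distance eases toward the depth under the crosshair instead of
# snapping, so looking at something nearer/farther takes a beat to resolve.

def update_focus(current_focus, target_depth, dt, refocus_time=0.3):
    """Move the focal distance one frame's worth toward target_depth.

    refocus_time is roughly how long a refocus takes to mostly settle;
    the exponential form makes the easing framerate-independent."""
    alpha = 1.0 - math.exp(-3.0 * dt / refocus_time)  # ~95% settled after refocus_time
    return current_focus + (target_depth - current_focus) * alpha

# Simulate half a second at 60 fps after glancing from a prop 2 m away
# to a wall 10 m away: focus has almost, but not quite, arrived.
focus = 2.0
for _ in range(30):
    focus = update_focus(focus, 10.0, dt=1 / 60)
```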

    • ShortFuse@lemmy.world · 13↑ · 2 days ago

      Most “film grain” is just additive noise, akin to digital camera noise. I’ve modded a bunch of games for HDR (I’m the RenoDX creator) and I strip it from almost every game because it’s unbearable. I have a custom film grain that mimics real film; at low levels it’s imperceptible and acts as a dithering tool to improve gradients (remove banding). For games that emulate a film look, the (proper) film grain sometimes lends itself to the look.
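      The grain-as-dither idea in miniature (an illustrative Python sketch, not actual RenoDX code): quantizing a smooth value to a few output levels causes banding, because every nearby input snaps to the same band. Adding noise smaller than one quantization step before rounding spreads the value across the two neighbouring levels so the average stays correct, trading hard band edges for fine grain.

```python
import random

def quantize(v, levels=8):
    step = 1.0 / (levels - 1)
    return round(v / step) * step

def dithered_quantize(v, rng, levels=8):
    step = 1.0 / (levels - 1)
    noise = (rng.random() - 0.5) * step          # +/- half a step of "grain"
    return quantize(min(1.0, max(0.0, v + noise)), levels)

rng = random.Random(0)
plain = quantize(0.4)                            # always snaps to the same band
dithered = [dithered_quantize(0.4, rng) for _ in range(1000)]
mean = sum(dithered) / len(dithered)             # lands near 0.4, unlike `plain`
```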

      • kautau@lemmy.world · 4↑ · 1 day ago

        Agreed. It fits very well in very specific places, but when not there, it’s just noise

  • Psythik@lemm.ee · 14↑ 1↓ · 2 days ago

    Hating on hair quality is a new one for me. I can understand turning off ray tracing if you have a low-end GPU, but hair quality? It’s been at least a decade since I last heard people complaining that their GPU couldn’t handle HairWorks. Does any game even still use it?

  • sp3ctr4l@lemmy.zip · 49↑ · 2 days ago

    Now… in fairness…

    Chromatic aberration and lens flares, whether or not you appreciate how they look (imo they arguably make sense in, say, CP77, since you have robot eyes)…

    … they at least usually don’t nuke your performance.

    Motion blur, DoF and ray tracing almost always do.

    Hairworks? Seems to be a complete roll of the dice depending on the specific game and your hardware.

    • Johanno@feddit.org · 8↑ · 2 days ago

      I love it when the hair bugs out and covers the whole distance from 0 0 0 to 23944 39393 39

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 2↑ · 2 days ago (edited)

      Motion blur and depth of field have almost no impact on performance. Same with anisotropic filtering, and I cannot understand why AF isn’t just defaulted to max, since even back in the golden age of gaming it had no real performance impact on any system.

      • sp3ctr4l@lemmy.zip · 6↑ 1↓ · 2 days ago (edited)

        You either haven’t been playing PC games very long, or aren’t that old, or have only ever played on fairly high end hardware.

        Anisotropic filtering?

        Yes, that… hasn’t been challenging for the kind of affordable PC an average person has to run at 8x or 16x for… about a decade. It doesn’t cause much framerate drop-off at all now, and didn’t until you go all the way back to the mid ’90s to early 2000s, when ‘GPUs’ were fairly uncommon.

        But that just isn’t true for motion blur and DoF, especially going back further than 10 years.

        Even right now, running CP77 on my Steam Deck, the AF level has basically no impact on my framerate, whereas motion blur and DoF have a noticeable impact.

        Go back even further, and a whole lot of motion blur/DoF algorithms were very poorly implemented by a lot of games. Nowadays we pretty much get the versions of those that were not ruinously inefficient.

        Try running something like Arma 2 on a mid- or low-range PC with motion blur on vs. off. You could get maybe 5 to 10 more fps with it off… and that’s a big deal when you’re maxing out at 30 to 40ish fps.

        (Of course now we also get ghosting and smearing from framegen algos that ironically somewhat resemble some forms of motion blur.)

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 2↑ · 2 days ago (edited)

          I am 40 and have been gaming on PC my entire life.

          Try running something like Arma 2 on a mid- or low-range PC with motion blur on vs. off. You could get maybe 5 to 10 more fps with it off… and that’s a big deal when you’re maxing out at 30 to 40ish fps.

          Arma is a horrible example, since it’s so poorly optimized that you actually get a higher frame rate maxing everything out compared to running everything on low. lol

          • sp3ctr4l@lemmy.zip · 3↑ 1↓ · 2 days ago (edited)

            If you’re 40 and have been PC gaming your whole life, then I’m going to guess you’ve had fairly high end hardware and are just misremembering.

            Arma 2 is unoptimized in general… but largely that’s because it basically uses a massive analog to a pagefile on your HDD, because of how its engine handles huge environments. It’s too much to jam through 32-bit OSs and RAM.

            When SSDs came out, that turned out to be the main thing that’ll boost your FPS in older Arma games, because they have much, much faster read/write speeds.

            … But, their motion blur is still unoptimized and very unperformant.

            As for setting everything to high and getting higher FPS… that’s largely a myth.

            There are a few postprocessing settings that work that way, and that’s because in those instances the ‘ultra’ settings are actually different algorithms/methods that are both less expensive and visually superior.

            It is still the case that if you set texture and model quality to low, and grass/tree/whatever draw distances very short, you’ll get more frames than with those things maxed out.

  • yonder@sh.itjust.works · 106↑ · 2 days ago

    Out of all of these, motion blur is the worst, but second to that is Temporal Anti Aliasing. No, I don’t need my game to look blurry with every trailing edge leaving a smear.

    • sp3ctr4l@lemmy.zip · 8↑ 2↓ · 2 days ago (edited)

      TAA is kind of the foundation that almost all real-time frame upscaling and frame generation (EDIT: I originally wrote “raytracing” here) are built on, and built off of.

      This is why it is increasingly difficult to find a newer, high fidelity game that even allows you to actually turn it off.

      If you could, all the subsequent magic bullshit would stop working, and all the hardware in your GPU designed to do that stuff would be basically useless.

      EDIT: I goofed, but the conversation thus far seems to have proceeded assuming I meant what I actually meant.

      Realtime raytracing is not per se foundationally reliant on TAA; DLSS and FSR frame upscaling, and the later framegen tech, basically are, since they evolved out of TAA.

      However, without the framegen frame rate gains enabled by modern frame upscaling… realtime raytracing would be too ‘expensive’ to implement on all but fairly high end cards / your average console, without serious frame rate drops.

      Before realtime raytracing, the paradigm was that scenes had static light maps and lighting environments baked into the map, with a fairly small number of dynamic light sources and shadows.

      With Realtime raytracing… basically everything is now dynamic lights.

      That tanks your frame rate, so Nvidia barrelled ahead with frame upscaling and later frame generation to compensate for the framerate loss they introduced with realtime raytracing. And because they’re an effective monopoly, AMD followed along, as did basically all major game developers and many major game engines (UE5, to name a really big one).

      • Vlyn@lemmy.zip · 7↑ 1↓ · 2 days ago

        What? All Ray Tracing games already offer DLSS or FSR, which override TAA and handle motion much better. Yes, they are based on similar principles, but they aren’t the mess TAA is.

        • sp3ctr4l@lemmy.zip · 9↑ · 2 days ago

          Almost all implementations of DLSS and FSR literally are evolutions of TAA.

          TAA 2.0, 3.0, 4.0, whatever.

          If you are running DLSS or FSR, see if your game will let you turn TAA off.

          They often won’t, because they often require TAA to be enabled before DLSS or FSR can hook into it and extrapolate from there.

          Think of TAA as a base game and DLSS/FSR as DLC. You very often cannot play the DLC without the base game, and if you actually dig into game engines, you’ll often find you can’t run FSR/DLSS without running TAA.

          There are a few exceptions to this, but they are rare.

          • Vlyn@lemmy.zip · 1↑ · 1 day ago

            TAA just means temporal anti aliasing. Temporal as in relying on data from the previous frames.

            The implementations of DLSS and FSR are wholly separate from the old TAA. Yes, they work on the same principles, but do their own thing.

            TAA as a setting gets disabled because the newer methods fully overwrite it. Some games hide the old setting, others gray it out; it depends.

            • sp3ctr4l@lemmy.zip · 1↑ · 24 hours ago (edited)

              The implementations of DLSS and FSR are wholly separate from the old TAA. Yes, they work on the same principles, but do their own thing.

              TAA as a setting gets disabled because the newer methods fully overwrite it.

              This is very often false.

              DLSS/FSR need per-pixel motion vectors, or at least frame-to-frame comparisons, to work.

              TAA very often is the thing they get those motion vectors from… i.e., they are dependent on it, not separate from it.

              Indeed, in many games, significant other portions/features of the graphics engine bug out massively when TAA is manually disabled, which means those features are also dependent on TAA.

              Sorry to link to the bad site, but:

              https://www.reddit.com/r/FuckTAA/comments/motdjd/list_of_known_workarounds_for_games_with_forced/

              And here’s all the games that force TAA which no one has yet figured out how to disable:

              https://www.reddit.com/r/FuckTAA/comments/rgxy44/list_of_games_with_forced_taa/

              Please go through all of these and notice how many modern games:

              1. Do not allow the user to turn off TAA easily, forcing them to basically mod the game by manually editing config files or more extensive workarounds.

              2. Don’t even tell the user that TAA is being used, requiring them to dig through the game to discover that it is.

              3. When TAA is manually disabled, DLSS/FSR breaks, or other massive graphical issues crop up.

              TAA is the foundational layer that many modern games are built on… because DLSS/FSR/XeSS and/or other significant parts of the game’s graphics engine hook into the per-pixel, per-frame motion comparisons done by TAA.

              The newer methods very often do not overwrite TAA, they are instead dependent on it.

              It’s like trying to run or compile code that depends on a library you don’t actually have present: it will either fail entirely, or kind of work, but in a broken way.

              Sure, there are some instances where DLSS/FSR is implemented in a game as its own self-contained pipeline… but very often that is not the case; TAA is a dependency for DLSS/FSR or for other graphical features of the engine.

              TAA is massively different from older MSAA, FXAA, or SMAA kinds of AA… because those don’t compare frames to previous frames; they just apply an effect to a single frame.

              TAA provides ways of comparing differences in sequences of frames, and many, many games use those methods to feed into many other graphical features that are built on top of, and require those methods.
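              To make the motion-vector dependency concrete, here's a toy 1-D sketch (illustrative Python, not any engine's real code): blending the current frame with history only works if each pixel's history is fetched from where that content was last frame, and upscalers consume the same per-pixel motion vectors, which is why they so often hook into an existing TAA pass.

```python
# Toy 1-D temporal accumulation: blend each pixel with its history, but
# reproject first, i.e. fetch the history from where that pixel's content
# was last frame.

def temporal_blend(history, current, motion, weight=0.9):
    """history/current: lists of intensities; motion[i] = integer pixel
    distance the content now at index i has moved since last frame."""
    out = []
    for i, cur in enumerate(current):
        src = i - motion[i]                    # where this content came from
        if 0 <= src < len(history):
            out.append(weight * history[src] + (1 - weight) * cur)
        else:
            out.append(cur)                    # off-screen last frame: no history
    return out

history = [1.0, 0.0, 0.0, 0.0]                 # a bright dot at index 0...
current = [0.0, 1.0, 0.0, 0.0]                 # ...has moved to index 1
with_mv = temporal_blend(history, current, motion=[0, 1, 0, 0])
without = temporal_blend(history, current, motion=[0, 0, 0, 0])
```

              With correct motion vectors the dot stays at full brightness; with zeroed vectors the blend dims the dot and leaves a stale copy at its old position, which is exactly the ghosting/smearing people complain about.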

              To use your own words: TAA is indeed a mess, and you apparently have no idea how foundational this mess is to basically all of the heavily marketed, ‘revolutionary’ graphical rendering techniques of the past 5-ish years.

    • CanadaGeese@lemmy.zip · 0↑ 1↓ · 2 days ago

      Honestly motion blur done well works really well. Cyberpunk for example does it really well on the low setting.

      Most games just dont do it well tho 💀

  • Artyom@lemm.ee · 124↑ 3↓ · 3 days ago

    Step 1. Turn on ray tracing

    Step 2. Check some forum or protondb and discover that the ray tracing/DX12 is garbage and gets like 10 frames

    Step 3. Switch back to DX11, disable ray tracing

    Step 4. Play the game

    • frezik@midwest.social · 5↑ · 2 days ago

      Best use of ray tracing I’ve seen is to make old games look good, like Quake II or Portal or Minecraft. Newer games are “I see the reflection in the puddle just under the car when I put them side by side” and I just can’t bring myself to care.

      • Artyom@lemm.ee · 1↑ · 2 days ago

        Control and Doom Eternal are the only exceptions to this rule I’ve played, but they are very much the exception.

    • ElectroLisa@lemmy.blahaj.zone · 10↑ 7↓ · 3 days ago

      If I know a game I’m about to play runs on Unreal Engine, I’m passing a -dx11 flag immediately. It removes a lot of useless Unreal features like Nanite

      • ShinkanTrain@lemmy.ml · 23↑ · 2 days ago

        Then you get to enjoy the worst LODs known to man, because they were only made as a fallback.

      • boletus@sh.itjust.works · 13↑ · 2 days ago

        Nanite doesn’t affect any of the postprocessing stuff, nor the smeary look. I don’t like that games rely on it, but modern UE5 games author their assets for Nanite. All it affects is model quality and LODs.

        Lumen and other real-time GI stuff are what force them to use temporal anti-aliasing and other blurring effects; that’s where the slop is.

        • sp3ctr4l@lemmy.zip · 3↑ 1↓ · 2 days ago (edited)

          Nanite + Lumen run like garbage on anything other than super high end hardware.

          It is also very difficult to tweak and optimize.

          Nanite isn’t as unperformant as Lumen, but it’s basically just a time saver for game devs, and it’s very easy for a less skilled dev to think they are using it correctly… and actually not be.

          But Nanite + Lumen have also become basically the default for everything from AAA games down to shitty asset flips… because they’re easier to use from a dev standpoint.

  • lime!@feddit.nu · 79↑ 8↓ · 2 days ago

    motion blur is essential for a proper feeling of speed.

    most games don’t need a proper feeling of speed.

    • Waffle@infosec.pub · 28↑ · 2 days ago

      Motion blur is guaranteed to give me motion sickness every time. Sometimes I forget to turn it off in a new game… about 30 minutes in, I’ll break into cold sweats and feel like I’m going to puke. I fucking hate that it’s on by default in so many games.

      • sugar_in_your_tea@sh.itjust.works · 7↑ · 2 days ago

        It really should be a prompt at first start. Like, ask a few questions like:

        • do you experience motion sickness?
        • do you have epilepsy?

        The answers to those would automatically disable certain settings and features, or drop you into the settings.

        It would be extra nice for a platform like PlayStation or Steam to remember those preferences and the game could read them (and display a message so you know it’s doing it).
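        A minimal sketch of what that first-launch prompt could do (the setting names here are invented; a real implementation would live in the engine's options system):

```python
# Two yes/no accessibility questions whose answers pre-disable the
# problematic effects before the player ever sees them.

DEFAULTS = {
    "motion_blur": True,
    "depth_of_field": True,
    "screen_flashes": True,   # strobing/flash effects
}

def apply_accessibility_answers(settings, motion_sickness, epilepsy):
    """Return a copy of settings with flagged effects switched off."""
    adjusted = dict(settings)
    if motion_sickness:
        adjusted["motion_blur"] = False
        adjusted["depth_of_field"] = False
    if epilepsy:
        adjusted["screen_flashes"] = False
    return adjusted

safe = apply_accessibility_answers(DEFAULTS, motion_sickness=True, epilepsy=False)
```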

    • sp3ctr4l@lemmy.zip · 14↑ 2↓ · 2 days ago

      … What?

      I mean… the alternative is to get hardware (including a monitor) capable of running the game at an fps/Hz above roughly 120 (ymmv), such that your actual eyes and brain do real motion blur.

      Motion blur is a crutch to simulate that, from back when hardware was much less powerful and max resolutions and frame rates were much lower.

      At higher resolutions, most motion blur algorithms are quite inefficient and eat your overall fps… so it would make more sense to just remove it, get the higher fps, and experience actual motion blur from your eyes + brain.

      • lime!@feddit.nu · 5↑ · 2 days ago

        my basis for the statement is BeamNG. at 100hz, the feeling of speed is markedly different depending on whether motion blur is on. 120 may make a difference.

      • AdrianTheFrog@lemmy.world · 1↑ 2↓ · 2 days ago

        I think you still see doubled images instead of a smooth blur in your peripheral vision when you’re focused on, for example, the car in a racing game.

    • pyre@lemmy.world · 9↑ · 2 days ago

      yeah the only time I liked it was in need for speed when they added nitro boost. the rest of the options have their uses imo I don’t hate them.

  • Soapbox1858@lemm.ee · 21↑ · 2 days ago

    I don’t mind a bit of lens flare, and I like depth of field in dialog interactions. But motion blur and chromatic aberration can fuck right off.

      • kerrigan778@lemmy.world · 1↑ · 23 hours ago (edited)

        I mean, lens flare does happen in the eye, just much less dramatically, because there’s only the one lens and everything is round. The “glare” where the rest of your vision gets washed out because the sun is in your field of view is a manifestation of lens flare. Eyelashes can also produce weird light artifacts that resemble camera lens flares, but that’s a different phenomenon.

      • Soapbox1858@lemm.ee · 3↑ · 2 days ago

        That’s fair. I usually turn it off for FPS games. But if it’s mild, I leave it on for third person games where I am playing as a camera.

  • zipzoopaboop@lemmynsfw.com · 43↑ 1↓ · 2 days ago

    I don’t understand who decided that introducing the downfalls of film and cameras made sense when the goal is mimicking the accuracy and realism of the human eye.

  • DaddleDew@lemmy.world · 74↑ · 3 days ago

    Has the person who invented the depth of field effect for a video game ever even PLAYED a game before?

    • taiyang@lemmy.world · 39↑ · 3 days ago

      I mean, it works in… hmmm… RPGs, maybe?

      When I was a kid there was an effect in FF8 where the background blurred out in Balamb Garden and it made the place feel bigger. A 2D painted background blur, haha.

      Then someone was like, let’s do that in the twenty-first century and ruined everything. When you’ve got draw distance, why blur?

      • DaddleDew@lemmy.world · 35↑ · 3 days ago

        Yes, it makes sense in a game where the designer already knows where the important action is and controls the camera to focus on it. It does not work, however, in a game where the action could be anywhere and the camera doesn’t necessarily focus on it.

        • taiyang@lemmy.world · 9↑ · 2 days ago

          Yup, or if they’re covering up hardware deficiency, like Nintendo sometimes does. And even then, they generally prefer to just make everything a little fuzzy, like BotW.

    • shneancy@lemmy.world · 6↑ · 2 days ago

      it works great for games that have little to no combat, or combat that’s mostly melee and up to like 3v1. or if it’s a very slight DOF that just gently blurs things far away

      idk what deranged individual plays FPS games with heavy DOF though

    • alaphic@lemmy.world · 9↑ · 3 days ago

      Well, not exactly, but they were described to him once by an elderly man with severe cataracts and that was deemed more than sufficient by corporate.

    • 11111one11111@lemmy.world · 1↑ 32↓ · 3 days ago

      What is the depth of field option? When it’s on what happens vs when it’s off?

      Side question: why the fuck does everything in IT reuse fucking names? Depth of field means how far from the character it’ll render the environment, right? So if the above option only has an on or off state, then it’s affecting something other than the actual depth of field, right? So why the fuck would the name of it be depth of fucking field??? I see this shit all the time as I learn more and more about software.

      • tehmics@lemmy.world · 35↑ · 3 days ago

        No.

        Depth of field is when background/foreground objects get blurred depending on where you’re looking, to simulate eyes focusing on something.

        You’re thinking of draw distance, which is where objects far away aren’t rendered at all. Or possibly level of detail (LOD), where distant objects are swapped to a lower-detail model as they get further away.
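        To make the distinction concrete, here's how the three settings treat a single object each frame (an illustrative Python sketch; the thresholds and names are invented):

```python
# Draw distance culls the object, LOD swaps its model, and depth of
# field blurs it based on its distance from the focal plane.

def render_decision(object_depth, focus_depth,
                    draw_distance=100.0, lod_switch=50.0):
    if object_depth > draw_distance:
        return "culled"                      # draw distance: not drawn at all
    model = "low_lod" if object_depth > lod_switch else "full_detail"
    blur = abs(object_depth - focus_depth)   # DoF: grows away from the focus
    return model, blur
```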

      • StitchIsABitch@lemmy.world · 11↑ · 3 days ago

        In this context it just refers to a post processing effect that blurs certain objects based on their distance to the camera. Honestly it is one of the less bad ones imo, as it can be well done and is sometimes necessary to pull off a certain look.

      • DaddleDew@lemmy.world · 9↑ 1↓ · 3 days ago (edited)

        When it’s on, whatever the playable character looks at will be in focus, and everything at other distances will be blurry, as would be the case if your eyes were the character’s eyes. The problem is that the player’s eyes are NOT the character’s eyes. Players can look anywhere on the screen, and the vast majority of them do so all the time in order to play the game. But with that stupid feature on, everything else is blurry, and the only way to bring something into focus is to move the character’s view along with your eyes. It constantly feels like something is wrong with your eyes and you can’t see shit.

        • Zozano@aussie.zone · 3↑ 2↓ · 2 days ago

          It’s like motion blur. Your eyes already do that, you don’t need it to be simulated…

          • Buddahriffic@lemmy.world · 6↑ 1↓ · 2 days ago

            For depth of field, our eyes don’t automatically do that for a rendered image. It’s a 2D image when we look at it; all pixels are the same distance away and all are in focus at the same time. The effect is the one you get when you look at something in the distance and hold a finger near your eye: the finger is blurry (unless you focus on it, in which case the distant objects become blurry).

            Even VR doesn’t get it automatically.

            It can feel unnatural because we normally control it unconsciously (or consciously if we want to and know how to control those eye muscles at will).

          • FooBarrington@lemmy.world · 5↑ 1↓ · 2 days ago

            No, your eyes can’t do it on a screen. The effect is physically caused by the different distances of two objects, but the screen is always the same distance from you.

            • Zozano@aussie.zone · 3↑ 1↓ · 2 days ago

              Yes, but you still get the blurry effect outside of the spot on the screen you’re focused on.

              • FooBarrington@lemmy.world · 1↑ · 2 days ago (edited)

                Not in the same way. Our eyes have lower resolution away from the center, but that’s not what’s causing DoF effects. You’re still missing the actual DoF.

                If the effect was only caused by your eye, the depth wouldn’t matter, but it clearly does.

                • Zozano@aussie.zone · 2↑ · 2 days ago

                  Yeah I get it, I’m just saying it’s unnecessary. If I need to see what’s going on in the background, then my eyes should be able to focus on it.

                  There are very few scenarios where DoF would be appropriate (like playing a character who lost their glasses).

                  It’s like chromatic aberration, which feels appropriate for Cyberpunk, since the main character gets eye implants and it fits the cyberpunk theme.

          • SitD@lemy.lol · 4↑ · 2 days ago

            to be fair you need it for 24fps movies. however, on 144Hz monitors it’s entirely pointless indeed

            • Zozano@aussie.zone · 3↑ · 2 days ago

              My dad showed me the Avatar game on PS4. The default settings have EXTREME motion blur; just turning the camera turns the world into a mess of indecipherable colors. It’s sickening.

              Turning it off changed the game completely.

      • Zozano@aussie.zone · 6↑ · 3 days ago

        Depth of field is basically how your character’s eyes are unfocused on everything they aren’t directly looking at.

        If there are two boxes 20 meters apart, one of them will be blurry while you’re aiming at the other.

        • sp3ctr4l@lemmy.zip · 4↑ · 2 days ago

          Your example is great at illustrating how DoF is often wildly exaggerated in implementation, giving the player the experience of having very severe astigmatism, far beyond the real-world DoF experienced by the average… eyeball haver.

      • JackbyDev@programming.dev · 2↑ · 2 days ago

        Put your finger in front of your face. Focus on it. Background blurry? That’s depth of field. Now look at the background and notice your finger get blurry.