lol. Has anyone found ways to optimize Starfield for their PC, like reducing stuttering, FPS drops, etc.?

  • redfellow@sopuli.xyz

    They didn’t optimize it for consoles either. The Series X has roughly the graphical grunt of an RTX 3060, yet the game is capped to 30fps and looks worse than most other AAA games that offer variable framerates up to 120fps. Todd says they went for fidelity. Has he played any recent titles? The game looks like crap compared to many games from the past few years, and it requires more power.

    The real reason behind everything is the shit they call the Creation Engine: an outdated hot mess of an engine that’s technically behind pretty much everything the competition is using. It’s beyond me why they’ve not scrapped it - they should have done that after FO4 already.

    • stigmata@lemmy.world

      Weird how everyone jokes about how shitty Bethesda developers are, yet everyone’s surprised at how badly Starfield runs.

    • Huschke@programming.dev

      And don’t forget the constant loading screens. A game that has so many of them shouldn’t look this bad and run this poorly.

    • Cethin@lemmy.zip

      Look, I agree with everything in the first paragraph, and the CE does seem to have a lot of technical debt that shows particularly in Starfield, which is trying to do something different than the other games. The engine being “old” isn’t the problem though (I know that’s not exactly what you said, but it’s often said), and it being “technically behind” other engines isn’t really true in all ways.

      The Creation Engine has been adapted by Bethesda to be very good at making Bethesda games. They know the tools and the quirks, and they can modify it to make it do what they want. It has been continuously added onto, just as Unreal Engine has been continuously added onto since 1995. The number after the engine name doesn’t mean anything beyond where they decided to mark a major version change, which may or may not involve refactoring and the like. My guess is that CE2 (Starfield’s engine) is only called CE2 because people on the internet keep saying the engine is old while telling them to use UE instead - an engine that’s approximately the same age as Gamebryo but gets a new number tacked onto the end.

      • MonkderZweite@feddit.ch

        which is trying to do something different than the other games.

        The other games from Bethesda, right?

        • Cethin@lemmy.zip

          Yeah, I meant Bethesda games, but it’s also different from what most games are trying to do. The exceptions being Elite Dangerous and Star Citizen (whose Kickstarter was more than 10 years ago at this point…).

    • PatFusty@lemm.ee

      Correct me if I’m wrong, but don’t they limit frame rates so they can reduce TV stuttering? The NTSC standard for TVs is 29.97 or 59.94 fps. I assume they chose 30fps so it can be used more widely, and that if it were scaled to 60 it would just increase frame time lag. Again, I’m not sure.
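
      For reference, the raw frame times behind those rates (just 1/fps, nothing Starfield-specific):

      ```python
      # Frame time is simply the reciprocal of the frame rate.
      for fps in (29.97, 30.0, 59.94, 60.0):
          print(f"{fps:5.2f} fps -> {1000.0 / fps:5.2f} ms per frame")
      # ~33.3 ms per frame at 30 fps vs ~16.7 ms at 60 fps
      ```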

      Also, comparing CE2 to CE1 is like comparing UE5 to UE4. Also, I don’t remember, but doesn’t Starfield use the Havok engine for animations?

      Edit: rather than downvoting, just tell me where I’m wrong

      • Nerdulous@lemm.ee

        Not to put too fine a point on it, but you’re wrong because your understanding of frame generation and displays is slightly flawed.

        Firstly, most people’s displays, whether a TV or a monitor, are at least capable of 60Hz, which it seems you correctly assumed. That said, most TVs and monitors aren’t capable of what’s called variable refresh rate (VRR). VRR allows the display to match however many frames your graphics card is able to put out, instead of the graphics card having to match your display’s refresh rate. This eliminates screen tearing and gives you the best frame times at your disposal, as each frame is generally created and then immediately displayed.

        The part you might be mistaken about, from my understanding, is the frame time lag. Frame time is the inverse of FPS: the more frames generated per second, the less time between frames. Under circumstances where there is no VRR and the frame rate does not align with the display’s native rate, there can be frame misalignment. This occurs when the monitor is expecting a frame that is not yet ready; it’ll reuse the previous frame, or part of it, until a new frame becomes available to be displayed. This can result in screen tearing or stuttering, and yes, in some cases it can add additional delay between frames. In general, though, a framerate above 30 FPS will feel smoother on a 60Hz display than a locked 30 FPS, where every frame is guaranteed to be displayed twice.
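
        A toy sketch of that cadence idea, assuming a fixed 60Hz display with no VRR and perfectly even frame delivery (real GPUs and compositors are messier than this):

        ```python
        # Toy model: on a fixed 60 Hz display without VRR, each refresh shows
        # the newest game frame that has finished rendering by that point.
        REFRESH_HZ = 60

        def repeats_per_frame(fps: int, num_refreshes: int = 12) -> list[int]:
            """Count how many consecutive refreshes each game frame stays on screen."""
            counts: list[int] = []
            last_frame = -1
            for r in range(num_refreshes):
                frame = (r * fps) // REFRESH_HZ  # newest frame ready at refresh r
                if frame == last_frame:
                    counts[-1] += 1      # the display repeats the previous frame
                else:
                    counts.append(1)     # a fresh frame hits the screen
                    last_frame = frame
            return counts

        print("30 fps:", repeats_per_frame(30))  # [2, 2, 2, 2, 2, 2] - every frame shown twice, even pacing
        print("45 fps:", repeats_per_frame(45))  # mix of 1s and 2s   - uneven pacing, reads as judder
        print("60 fps:", repeats_per_frame(60))  # all 1s             - a new frame on every refresh
        ```

        With VRR the display just waits for whichever frame is ready next, so that repeat pattern (and the judder with it) goes away, up to the panel’s maximum refresh rate.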

        • PatFusty@lemm.ee

          Thanks, I was recently reading about monitor interlacing and I must have jumbled it all up.

      • redfellow@sopuli.xyz

        Todd said they capped it at 30 for fidelity (i.e. high quality settings). The Series X supports variable refresh rate if your TV can utilise it (no tearing), and it chooses an applicable refresh rate, which you can also override. All TVs support 60Hz, many support 120Hz, and VRR is gaining traction, too.

        Take Remnant II: it has settings for quality (30), balanced (60), and uncapped - pick what you like.

        CE is still CE: the same floaty NPCs, hitting through walls, and bad utilisation of hardware have been there for ages. They can’t fix it, so it’s likely tech debt. They need to start fresh or jump to an already working modern engine.

      • HerrBeter@lemmy.world

        That’s for movies; I don’t remember why, but films can be fine at 30fps. Games are kinda horrible at 30fps, and all TVs I know of have a 60Hz or higher refresh rate for PC signals.

        • Malta Soron@sopuli.xyz

          IIRC it’s just because we’re used to the lower framerate in movies. If you look up some 60 FPS videos on YouTube you’ll notice how much smoother they look.

          Personally, I wish sports broadcasts were in 60 FPS by default. Often the action is so fast that 30 FPS just isn’t enough to capture it all.

          • Blackmist@feddit.uk

            Higher framerates make things look more real.

            This is fine if what you’re looking at is real, like a football match, but what the likes of The Hobbit showed us is that what you’re actually looking at is Martin Freeman with rubber feet on. And that was just 48fps.

            24fps cinema hides all those sins. The budget of the effects department is already massive. It’s not ready to cover all the gaps left by higher framerates.

            Even in scenes with few effects the difference can be staggering. I saw a clip from some Will Smith war movie (Gemini Man, I think), and in the 120fps mode the same scene looks like a bunch of guys playing paintball at the local club.

        • frezik@midwest.social

          Movies have some motion blur in their frames. Lots of directors have justified this on the basis that it looks more “dreamy”. Whether you buy that or not, the effect tends to let a lower fps still read as smooth motion to our eyes.

          It also helps that they lock in that framerate for the whole movie. Your eyes get used to that. When games suddenly jump from 120fps down to 65fps, you can notice that as stutter. Past a certain point, consistency is better than going higher.
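
          Putting rough numbers on that dip (just 1000/fps, nothing measured from the game itself):

          ```python
          # A dip from 120 fps to 65 fps shows up as a sudden frame-time spike.
          def frame_time_ms(fps: float) -> float:
              return 1000.0 / fps

          steady = frame_time_ms(120)  # ~8.3 ms per frame
          dip = frame_time_ms(65)      # ~15.4 ms per frame
          print(f"frame-time jump: {dip - steady:.1f} ms")  # ~7.1 ms hitch

          # A locked 60 fps is ~16.7 ms every frame - slower on paper, but the
          # spacing never changes, which is why consistency can feel smoother.
          ```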

          Starfield on PC, btw, is a wildly inconsistent game, even on top tier hardware. Todd can go fuck himself.

        • 520@kbin.social

          30fps in films looks okay because we’re used to it. Early Hollywood had to limit framerates because film stock wasn’t cheap.

          60fps is better for gaming because it allows the game to be more responsive to user input.