85
submitted 11 months ago by stopthatgirl7@kbin.social to c/gaming@beehaw.org

According to Bethesda Support, even the Intel Arc A770 GPU does not meet Starfield's PC minimum requirements.

all 25 comments
[-] MJBrune@beehaw.org 45 points 11 months ago* (last edited 11 months ago)

Yeah, that makes sense. They probably can't properly support a video card they couldn't get their hands on, since Intel didn't ship it until late last year. The cards also aren't that powerful. Lastly, the Intel drivers are brand new; most engines haven't been tested against them, so there are a lot of corruption bugs, which again makes sense because developers couldn't get the cards early enough to support them. Since Intel has now discontinued their flagship Arc card not even a year after release, it's unlikely any games will really support Intel GPUs in the future.

[-] ninjan@lemmy.mildgrim.com 63 points 11 months ago

That's a bit disingenuous. It's Intel's own Limited Edition A770 SKU that is discontinued, not the A770 as a model. They still ship the chip to AIB makers like ASRock etc. Their second generation, Battlemage, is still on track as well, so on the contrary I believe we'll see much better support for Intel GPUs in the coming years, since more game developers will have had adequate time with the hardware. Intel's cards are also priced competitively at the entry level, which is bound to land them in many cheaper pre-builts that parents buy for their younger kids. So I expect them to be quite commonly used for certain games in the coming years.

[-] k_rol@lemmy.ca 16 points 11 months ago

Thanks for correcting the disinformation.

[-] MJBrune@beehaw.org 2 points 11 months ago

The Limited Edition wasn't limited in the sense that they planned to stop making it; it's their flagship. This is what I got from a few articles. Whether they are still shipping chips to AIB partners wasn't clear from the places I read. Additionally, the Battlemage information seems to all come from leaks.

Either way, with how shoddy the drivers have been and how little hardware has been available, placing blame on video game developers for not supporting their cards is silly.

[-] ninjan@lemmy.mildgrim.com 4 points 11 months ago

I'm placing 0 blame on developers here but it's just a fact that Intel can't reasonably optimize the drivers for all games past and present in such a short time. And developers haven't had access to the card for even remotely long enough for it to be part of the testing for any game (outside small titles maybe but they generally don't need special treatment driver wise) releasing this year or next. AMD and Nvidia have literal decades of head start. So while I would've wanted Intel to do a better job I'm not trivializing the monstrous task either, and all things considered they've done OK. Not great, not horrible.

If it wasn't clear in the articles you read then those places wanted the clicks and engagement that comes from vaguely implying that Intel is killing their GPU division.

Falsehood flies, and the Truth comes limping after it - Jonathan Swift

[-] dudewitbow@lemmy.ml 5 points 11 months ago

It's not like Intel never had GPU drivers (they've had iGPUs forever); they just never had to constantly update them for the gaming audience.

Let's not pretend features like Intel's Quick Sync, which came out on Sandy Bridge iGPUs to do video encoding, didn't reshape how companies did encoding for viewing (which would lead to NVENC and AMD VCE) or scrubbing in the case of professional use.

The GPU driver team has existed for a while now; it's just that they were never severely pressured to update it specifically for gaming, as they really didn't have anything remotely game-ready until arguably Tiger Lake's iGPU.

[-] lemann@lemmy.one 5 points 11 months ago

Since Intel has now discontinued their flagship arc card not even a year after release

Whaaaat? That's disappointing ☹️ I was hoping finally there'd be some more competition

[-] penquin@lemm.ee 44 points 11 months ago

They didn't discontinue their cards, only the limited edition one.

[-] lemann@lemmy.one 7 points 11 months ago

Ohh, thank you for the clarification!

[-] penquin@lemm.ee 2 points 11 months ago

You're welcome :) I'm actually going to buy the 770 by the end of this year. Heard it works great on Linux.

[-] TheOakTree@beehaw.org 1 points 11 months ago

For anyone still following this thread in confusion, the Limited Edition (LE) card is Intel's equivalent of a Founder's Edition card. Intel stopped producing LE cards, but their AIB partners are still producing their own SKUs.

[-] freeman@lemmy.pub 4 points 11 months ago* (last edited 11 months ago)

I saw a graph yesterday that put them squarely between the NVIDIA 4000 series and the latest AMD gen in terms of performance.

Edit: I have bad memory. Here’s the graph. https://cdn.mos.cms.futurecdn.net/QKdmNvH8KqrZmnnqRDiz6k-970-80.png.webp

[-] MJBrune@beehaw.org 3 points 11 months ago

Yeah, that graph says their highest-end card matches the lowest-end 4000 series card, and the lowest-end 4000 series card isn't great.

[-] SenorBolsa@beehaw.org 7 points 11 months ago* (last edited 11 months ago)

Yup, it's not in the specs that it supports Intel graphics.

These days it's expected that any DirectX/Vulkan-supporting card can run just about anything, with varying levels of performance. Back in the day, it was very specific which cards a 3D game engine supported: if your card wasn't on the list, the game wasn't going to run outside of software mode unless a newer version of that card had backwards-compatibility features. Later on, you also had to worry about very specific shader and DirectX features being supported just to get the game to look right.

Just a bit interesting how times change. They definitely should have worked with Intel a bit to get it running, or at least given them a copy with enough lead time to work out their drivers to support it.

[-] DaSaw@midwest.social 6 points 11 months ago

I don't know much about specs. I just find it fascinating that people are actually defending Bethesda in this post. Where's the standard anti-Bethesda fandumb pile on?

[-] lustyargonian@lemm.ee 2 points 11 months ago
[-] DaSaw@midwest.social 2 points 11 months ago

I've probably seen it here more than on Reddit, but that's because I spend more time in the general gaming community here, while on Reddit I was in the fan community specifically... particularly teslore, where "Duh, TES lore is stupid and random" doesn't get much traction.

[-] conciselyverbose@kbin.social 1 points 11 months ago

The problem is that no one actually really follows the specs (or the specs don't define everything). So you can't just build to the spec and have your game work. You have to know all the ways the different hardware manufacturers cheat and adapt your stuff to their drivers.

If intel still has issues in their drivers and implementation, developing to run correctly on their cards isn't trivial at all. It should still mostly work, but it's hard to catch every edge case without experience with how they do things.
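The "adapt your stuff to their drivers" point can be sketched concretely. Graphics APIs expose the GPU's PCI vendor ID (e.g. Vulkan's VkPhysicalDeviceProperties.vendorID), and engines commonly branch on it to enable per-vendor workarounds. A minimal illustration in Python follows; the vendor IDs are the real PCI-SIG assignments, but the workaround flags themselves are hypothetical, invented for the example and not taken from any actual engine:

```python
# Map real PCI vendor IDs (as exposed by e.g. Vulkan's
# VkPhysicalDeviceProperties.vendorID) to vendor names.
PCI_VENDOR_IDS = {
    0x1002: "AMD",
    0x10DE: "NVIDIA",
    0x8086: "Intel",
}

def driver_workarounds(vendor_id: int) -> set:
    """Return a hypothetical set of engine-side workaround flags
    for the given GPU vendor. Unknown vendors get no workarounds."""
    vendor = PCI_VENDOR_IDS.get(vendor_id, "Unknown")
    if vendor == "Intel":
        # A newer driver stack with known edge cases might get
        # conservative rendering paths (flags are made up here).
        return {"disable_async_compute", "force_robust_buffer_access"}
    if vendor == "AMD":
        return {"pad_constant_buffers"}
    return set()
```

An engine would run this once at device creation and consult the returned flags throughout rendering, which is why untested hardware like early Arc cards tends to fall through to the generic path and hit the edge cases nobody caught.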

[-] BaroqueInMind@kbin.social 1 points 11 months ago

What's the Intel equivalent to the nVidia 4080?

[-] exscape@kbin.social 15 points 11 months ago

How is that relevant? The RTX 4080 is not the minimum requirement for Starfield.

[-] theangriestbird@beehaw.org 12 points 11 months ago

there isn't one. At the moment, they aren't aiming that high. Their performance varies wildly from game to game, but at best, their most powerful card atm punches at about 3060 levels.

this post was submitted on 08 Sep 2023
85 points (98.9% liked)