
CP77, TES 6, Starfield: New GPU or XBSX?

CerebralHawk

Over the last week I've learned that my graphics card, a Radeon R9 280 (3GB, 950MHz), which was good enough to run Fallout 4 and Skyrim SE with no problems on high settings at 1080p, was only good enough to get about 15 FPS tops in Cyberpunk 2077, and I worry that it will face a similar fate with Starfield, and possibly even worse with The Elder Scrolls VI. So, I'm faced with a couple of options, and I'm wondering what you guys would do, or have done, if you were in the same boat.

Option one is the cheapest and easiest: buy an Xbox Series X. The obvious caveat is that I'm limited to whatever Bethesda allows for modding on Bethesda.net and whatever Microsoft allows on the new console. If the previous generation's (Xbox One) limit of 2GB has changed, I haven't heard anything about it. I'd like to think the XBSX can handle more, but as an XB1S owner, I have pretty low faith in Microsoft as a console maker, even though the Series X seems to be the more powerful of the two consoles. All things being equal, I'd go with the PS5 (albeit having used neither), since I just have more faith in Sony to deliver a good console than f—king Microsoft. But with Microsoft having acquired Zenimax, they've acquired my console loyalty along with it; buying a console that isn't guaranteed to get Bethesda games is a complete deal-breaker. As you can see, I'm not a big fan of Option One. That said, maybe Microsoft learned from the PS4 completely curb-stomping the XB1 last generation and has made a better (X)box this time around.

Option two: buy a new GPU. I have NO idea where to begin here. I don't know how much VRAM the graphics card needs. I don't know if I should stick with AMD or go to Nvidia; I don't have a preference. With Nvidia, I know the latest is the 3080, but are the 3070 and 3060 also good? Will a 2080 or 1080 be good enough? What about processors? Fans? I'm just kind of lost with graphics cards. On top of that, since Bitcoin is on the rise, graphics cards are next to impossible to get at market value; bots are buying them and flipping them for nearly double the price. So I hear. If I'm willing to pay $500 for an XBSX (and let's be honest, it's not $500, because you then have to pay more for games, you're probably gonna get Game Pass at $15 a month, or at least I'm strongly considering it, and there's the warranty), I'm at least forking over $600 out of my checking account for the damn thing. So my GPU budget should be at least that, or perhaps a little more, considering I'd be getting modding in my games as well. Though the question must be asked: do I even need to spend that much? My last GPU was around $320 and was considered upper-mid-range at the time I bought it.

Of course, there is a third option. Google Stadia lets you play games remotely, eliminating the need to pay for hardware in the first place. But the fact that Google has completely avoided every question about what happens to game purchases when they inevitably close the service in a few years makes it a non-starter as well, at least until they answer it. Especially considering games cost the same on Stadia as they do elsewhere. I mean, sure, if I pay $60 for Cyberpunk 2077 on Stadia, play it for six months to a year or however long, and beat it six ways to Sunday, I will surely have gotten my money's worth, and I shouldn't b—h too loudly when Stadia goes under and Google shows me deuces when I ask how to get my game code to redeem on Steam, or better yet GOG... but still, the thought of paying full price for a game and losing it after an indeterminate amount of time just doesn't sit well with me, even though I never paid a dime for the hardware it runs on. Also, games don't really go on sale on Stadia, though a $10/month subscription (which includes a free trial month) gets you 10% off purchases. That knocks Cyberpunk 2077 down to $54, and it works with existing controllers via the Chrome browser. Big caveat here: no modding whatsoever. So while this would be fine for a game like Cyberpunk 2077, it would be the worst of the three options for The Elder Scrolls VI and possibly Starfield as well.

So... WWYD?

EDIT: I looked into two other streaming options. GeForce Now is interesting in that, for free, you can connect your Steam account to it and play the games you already own. Well, some of them; Cyberpunk 2077 is included on GOG, Steam, and Epic. I tried two games that were not installed on my PC, Deus Ex: Mankind Divided and Saints Row: The Third, and both just picked up and played like they were installed. However, the free GeForce Now tier is capped at 720p30. It still looked great, just a little jagged around the edges, as it was a lower resolution than my monitor would have liked. Also, on the free tier you're limited to 1 hour at a time, though they never say when your time resets. While playing, a GeForce banner slid down from the top, showing me my remaining time. It didn't get in the way at all, and it only stayed for a few seconds. Despite streaming over only a 25Mbit down, 1.8Mbit up DSL connection, I experienced no perceptible latency. A lot of these services say you need at least 10Mbps, but GeForce seems to want more. Oh, and saves sync: in both cases I had saves already, and by signing into Steam, it synced my saves. Also, it didn't care that I don't have an Nvidia card installed, which is kind of odd. I have one in my laptop (and maybe that counts; I made the Nvidia account on the laptop, so Nvidia knows I have a 940MX GPU, and maybe that's good enough), but the desktop, again, has the Radeon R9 280. Lastly, GeForce Now also appears to be the cheapest streaming service, both at the free tier (1 hour at a time, 720p30) and at $24.95 for six months as an introductory "founders" rate.

The other option is, of course, Microsoft xCloud, which is part of Xbox Game Pass Ultimate. $15 a month gets you access to all their games, but the selection is hugely limited and does not include Cyberpunk. The neat thing about Game Pass is, you don't seem to need an Xbox to use it. I just need the Xbox for games that are not on Game Pass, which... kind of defeats the purpose.

So, a winner sort of emerges: GeForce Now, for free or about $4 a month, and I get to keep the game.
 
Staying on PC if possible is ideal, though TES6 is probably far enough away that current high-end GPUs might be mid- or low-end by then. They should still be able to run it, just not on the highest settings or at 4K. The CPU might also be a factor; you might need to upgrade it if you're on an old one. I have an i7-7700 and I would say it's just enough, though I don't expect it to remain viable for long, so I'm planning to get a Ryzen 5800X or something similar as soon as I can. I sometimes get frametime issues/stuttering, but most of the time it's fine. SSDs are pretty much mandatory now as well: since the XBSX and PS5 have SSDs, games can be built to take advantage of high read speeds, and they won't lose much of the audience if they no longer accommodate slow HDDs.

With Nvidia, I know the latest is the 3080, but are the 3070 and 3060 also good?
A 3070 or 3060 should do pretty well for now, and for a decent while, at 1080p or 1440p. Generally, upgrading every other GPU generation seems to be about right for most people. High-end enthusiasts tend to upgrade every generation, and budget GPUs tend to hold up poorly, so people who buy those can end up needing to upgrade every generation just to keep up.

AMD Radeon is improving but I would probably still give the edge to Nvidia.

Of course, there is a third option. Google Stadia lets you play games remotely, eliminating the need to pay for hardware in the first place. But the fact that Google has completely avoided every question about what happens to game purchases when they inevitably close the service in a few years makes it a non-starter as well, at least until they answer it. Especially considering games cost the same on Stadia as they do elsewhere.
Games will be lost when they close the service, and the Stadia device will be rendered useless; it's all effectively an expensive rental service. I would not recommend Stadia unless they make things significantly cheaper and you're on fiber internet. Xbox Game Pass and GeForce Now seem vastly superior to it.
 
GeForce Now seemed like a dream come true, until I hit up their forums, which are filled with people who paid but still had to wait in line, and people who paid but still found their games running at 720p and/or on Low settings. Fortunately, GeForce Now does have a viable free tier, and it lets me play games on my laptop that it couldn't handle otherwise. Though the $25/6-month price is a special introductory "founders" price that could go up, $50 a year is very reasonable to play games on hardware not otherwise capable of it. And of course you can play on phones... as long as said phone runs Android. I'm kind of a privacy guy, so mine doesn't, but some of the others (Stadia and xCloud, notably) have gotten around Apple's draconian restrictions by offering Safari web apps, a workaround Apple has even indirectly advocated. But Nvidia has announced no such plans. Not that I'm itching to play any game on a 4.7" screen (2020 iPhone SE), even in the bathroom. I think even if I had the iPhone Pro Max with its 6.7" screen, it would be a stretch for a lot of games. And, of course, Nvidia's complete and seemingly intentional avoidance of any Bethesda game makes it hard to support them; could be a Microsoft thing, but I do believe GeForce Now supports The Outer Worlds, which is Microsoft (Obsidian) as well.

As for Stadia, it seems they (quietly?) offered a deal where, if you bought Cyberpunk 2077 from them, they'd give you the Stadia controller (which is ugly AF, but allegedly comfortable to use) and a Chromecast Ultra, together a $100 value, but the deal has expired. I'd have been tempted if they'd offered me the deal, but knowing it existed for others kind of sours the current one ($60 to rent the game for an indeterminate amount of time). Honestly, I think if you buy any $60 game on Stadia, they ought to throw in some hardware, so you have something to show for your purchase when the service goes under (since the controller will become a Bluetooth controller, and the Chromecast Ultra has other uses). Though, again, privacy guy; I really don't like Google all that much. I respect that they have the infrastructure to roll something like Stadia out, but I also know they have commitment issues with services, and users of said services are often disappointed in what Google replaces them with (see Google Play Music → YouTube Music, and Google Inbox → Gmail; Gmail came first, but Inbox's unique features were never added to Gmail after Inbox closed). Also, Google doesn't like my ISP, and has threatened to throttle YouTube if I don't switch. Apparently some ISPs host mirrors of YouTube, which makes the experience better for everyone on the network; my ISP is not one of them, but their biggest competitor is. So it's not some evil plot, it actually makes sense, but I also like my ISP's neutral stance. The onus is on Google, not my ISP, to provide YouTube service. So I don't expect other Google services to work as well as they could.

Side note: Thanks for Superstructures. Looking at the screenshots, it seems like some of them might be out of place, but that has never been my experience in practice. It does highlight a fundamental issue I have with Sim Settlements as a whole: the randomness of what gets built. I know you can change the plot afterward, but being able to choose beforehand, while antithetical to the spirit of the mod, would help with planning. I mean, maybe the settler is using the ASAM to build what they "want," but maybe I myself (as the Sole Survivor) am using the ASAM to build what I want for the settlement. Though this would add an extra, perhaps unpopular, layer of nuance to the mod's workflow. That is, I set up zones where I want X plot in Y space, but it doesn't actually get built until a settler comes along and says "I need a home," and then they can "pick" from the available "zones," and then what I want gets built. That way, I choose what gets built and where, and the settler "chooses" which one they want. We both "win." Anyway, for the random factor that Sim Settlements offers, Superstructures certainly makes things more interesting!
 
If you haven't bought anything yet, my suggestion is to wait a little while, as Nvidia has the new 30-series lineup changing around a bit in the mid range. I think I heard of a 3060-ish card with something like 12GB of VRAM, and that would be a good mid-range setup; basically, the more VRAM the better, within reason of course. I bought the RTX 3090, and with its 24GB of VRAM it's a waste even for 4K games, which only require 10GB, maybe 12 max, but I had a decent last few months at work at the end of last year, so it's all good.

AMD also has cards sporting 16GB and costing like $500 or less depending on the model, and the 5800, if you can get one at a normal price, is a good 4K no-ray-tracing card. A 3070 Ti may be similar in cost as well, and that's a nice setup considering it's better than the previous 2080 Ti. Guess it all comes down to budget and needs or wants.
 
No, I haven't bought anything yet. I guess my budget is about $500 for a graphics card, considering that's what a Series X would cost. I'm also a bit leery about continuing to upgrade this computer. What I want to do is take it apart, piece by piece, and give it a thorough cleaning; it's got to be nasty in there, but I'm lazy and "it works right now, so why bother?" I know there's no dead mouse inside my computer, but there are probably a ton of dust bunnies that aren't really hurting anything but my OCD. Also, I've had two fans go out and I never replaced them, and they don't even cost that much; again, laziness and complacency. I'm... not a good PC builder. I'm just alright at it. I'm not putting myself down in any way in saying this, or voicing a lack of confidence, but I would be better off as a Mac user with an Xbox or PlayStation for gaming, just on account of preferring convenience, at this point, to customization. But... Bethesda game mods are addicting, and I'm not sure I can set that aside and just enjoy the vanilla game.

It doesn't help that my wife isn't really into gaming. She never really was, aside from Pokemon and the occasional Zelda game (Disney/Nintendo 80s kid). She got an Xbox 360 and Xbox One just for the Rockband games (worth noting: Rockband 4 had a Fallout 4 Vault suit as free DLC at launch, not sure if it's still available, so your band members could be wearing Vault suits... not sure if they had Pip-boys or not) and may have played a bit of a couple other games, but mostly the simple games, like Hexic (free puzzle game on the 360). She has a 3DS and got more usage out of that. So anyway, she's not driving any gaming purchases. She's somewhat behind the Series X option because her brother has one and they could, in theory, play games together, but in practice that's never happened. He plays the same games I do, though far more casually (beat them once then drop them) and he's more into the competitive online shooters I have no interest in. So he'll get The Elder Scrolls VI for sure, but since that will most likely be single-player only, it won't matter if he and I play it on the same console or not. We might play Fallout 76 together, if he owned it, and if I didn't think it was a total piece of s–t (I don't own it either).

Anyway, I've pretty much dropped Cyberpunk 2077, as it's completely unplayable at this stage on my hardware. I'm sure I'll pick it up again in the future. Hell, I owned Half-Life 2 for about 4 years (?) before I could play it. I bought a CPU+mobo combo on Newegg, back when that was a thing (a good thing, I mean), and it came with a Steam code for Half-Life 2. I redeemed it, and was one of the first Steam account holders (signed up in the first year), but I could not get the game to run until I built my next PC, though I think by then I'd bought The Orange Box on Xbox 360... it was $20. Back then, I was a huge PC gaming snob (before "PC Master Race" was a thing), and likened the two analog sticks on modern console controllers to Etch-a-Sketch knobs. Portal taught me how to use them, and I've been a gamepad player since. That one level where you keep going up, near the end of the test chambers.
 
So on the Nvidia side, the RTX 3060 has 12GB of VRAM, costs about $320, and is supposedly 1.3 times more powerful than the PS5/XBSX; that's probably one of the best deals coming out next month.

As for the rest of your PC, I'd say a nice AMD 5000 series, maybe the 5600 or even the 5800 if budget allows, is a good new option, as Intel has nothing at this moment and probably nothing competitive until next year at the earliest.

That's a good combo that won't break the bank and will get you high enough FPS in modern games to last a long while. This is all my personal opinion, though, so pick and choose whatever fits your needs. As for PS5 vs. Xbox, it's pretty clear Xbox is my choice, since a lot of future Bethesda games will be Xbox exclusives, but yeah, the limited mod download size is depressing on consoles.
 
$400 and 8GB according to the site, but still, not bad... what I have now has 3GB, and is a lot less powerful. And yeah, I want Nvidia for my next video card.

I actually don't know what specs the XBSX and PS5 have, only that they've ditched hard drives for SSDs, and they have Blu-ray players (like the past generation of Xbox and the last two of PlayStation).

As for the rest of your PC, I'd say a nice AMD 5000 series, maybe the 5600 or even the 5800
AMD Threadripper CPU? Nah, my Xeon should still be fine — switching processors would mean switching motherboards as well. Linked in case you want to have a look, price is about what I paid as well. I bought it based on the argument that a Xeon is basically an i7 minus the onboard GPU, which I didn't need as I was buying a video card.

the limited mod download size is depressing on consoles.
I wonder if they'll raise it for the XBSX and The Elder Scrolls VI. Either way, the point is that we're at their mercy. It's more likely that they will raise the limit than that they will eliminate the limit altogether, but it's probably most likely that they will keep it at 2GB. I wonder if PS5 will allow modding at all. Sony promised it for the PS4, reneged, and then compromised due to fan backlash, but their compromise (900MB and no external assets) was not good, though we did see PS4 gamers (like YouTuber norespawns) make the most of it.
 
Not the 3060 Ti; the regular old RTX 3060 has 12GB of VRAM and offers roughly 2080 Ti-ish performance, to me. That at $320, maybe $350, is crazy, and supposedly it's 1.3 times more powerful than the new consoles.
And the 5000 series is just a regular desktop CPU, not Threadripper, but as long as what you've got is fine, then go for the GPU and a nice NVMe drive if supported, or a nice regular SSD, and that should be a good improvement.
 
My SSD isn't NVMe; that was cost-prohibitive at the time. I spent about $320 on a 1TB SSD back then: I had a 750GB hard drive, and the thought of having less worried me, so I went up instead. NVMe would have been about $700–750 then. Now, the same SSD (WD Blue, nothing fancy) is like $85–115 (prices fluctuate), and 1TB NVMe SSDs are around a much more accessible $200–250.

NVMe might be twice as fast as regular, or close to it, but it's like deciding between 1080p (Blu-ray) and 4K when you've been living at 480p (DVD) for years. Yeah, one is clearly better, but both are gonna rock your world, and if one is much cheaper (like regular SSDs were, but unlike regular Blu-rays vs the 4K variants), you might as well just grab that.

Fortunately, I think Cyberpunk 2077 is the only game I really care about that my system can't currently run, so I have time — to wait for GPU prices to normalize, and to wait for CDPR to fix their game.
 
Yes, I'm really hoping they do some serious bug fixes and gameplay changes; it's a decent game on PC at the moment, but nothing great.
 
Overall, I would suggest prioritizing a GPU upgrade to the RTX 3060 12GB and planning for an eventual CPU upgrade somewhere between a year and several years from now. I would stick with the SATA 1TB SSD as your main gaming storage.

Nah, my Xeon should still be fine — switching processors would mean switching motherboards as well. Linked in case you want to have a look, price is about what I paid as well. I bought it based on the argument that a Xeon is basically an i7 minus the onboard GPU, which I didn't need as I was buying a video card.
That appears to be about par with my i7 7700, which I'm looking to replace when I can find a Ryzen 5800x or 5900x. I hope to replace the i7 within a year.
I wonder if they'll raise it for the XBSX and The Elder Scrolls VI. Either way, the point is that we're at their mercy. It's more likely that they will raise the limit than that they will eliminate the limit altogether, but it's probably most likely that they will keep it at 2GB. I wonder if PS5 will allow modding at all. Sony promised it for the PS4, reneged, and then compromised due to fan backlash, but their compromise (900MB and no external assets) was not good, though we did see PS4 gamers (like YouTuber norespawns) make the most of it.
Skyrim SE had a 5GB limit on XB1, so at the very least The Elder Scrolls VI should have that much on XBSX. It'll be well into the next generation, so 4096x4096, 8192x8192, or even larger will likely be standard for texture sizes, and there could be more texture map files by then (I believe many games use more types than the 3 we have in Fallout 4). On PS5, who knows; it's unlikely, but perhaps they might have to allow it. I think they might end up at a disadvantage compared to Xbox in terms of exclusives. The game won't be on PS4 or XB1, and by the time it releases even budget systems should be at least around par with the new consoles. With SSDs, quite a lot more can feasibly be added before any significant performance loss due to storage reads, so it becomes more about good modding practices and conflict management than worrying about install size. I have a pretty good NVMe, and from the looks of things the XBSX has one that is about par or potentially better. TESVI is probably around 2025 or potentially later; by then most PCs will probably come with SATA SSDs.
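
To put rough numbers on why texture resolution chews through those limits so quickly, here's a quick back-of-the-envelope sketch. The 1 byte per pixel figure assumes BC7-style block compression, and the roughly one-third mipmap overhead is a generic rule of thumb, not anything specific to Bethesda's texture pipeline; it's purely illustrative.

```python
# Rough texture-size math, purely illustrative: assumes BC7-style block compression
# at 1 byte per pixel and ~33% extra for a full mip chain. Real DDS files vary.

def texture_mib(side_px: int, bytes_per_px: float, mip_overhead: float = 1 / 3) -> float:
    """Approximate size in MiB of a square texture including its mip chain."""
    base_bytes = side_px * side_px * bytes_per_px
    return base_bytes * (1 + mip_overhead) / (1024 ** 2)

for side in (2048, 4096, 8192):
    compressed = texture_mib(side, 1.0)    # BC7-style compressed
    raw = texture_mib(side, 4.0)           # uncompressed RGBA8
    print(f"{side}x{side}: ~{compressed:.0f} MiB compressed, ~{raw:.0f} MiB raw")
```

At roughly 85 MiB per compressed 8192x8192 texture, a couple dozen of them would already blow past a 2GB budget on their own, which is why the per-file and total caps matter so much.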
 
[Your Xeon] appears to be about par with my i7 7700, which I'm looking to replace when I can find a Ryzen 5800x or 5900x. I hope to replace the i7 within a year.
Intel to AMD, how do you feel about that? I've never been deep into the rivalry, but growing up in the 80s, I always had this perception that Intel was superior and AMD was the knock-off. My first PC used an Athlon64, and it was okay, but not great. For my second PC, which was meant to be a "cheap" PC for my wife while I built a gaming rig later, I used an AthlonFX (I think) and I found that lacking. Now AMD seems to be on top with the Threadripper stuff and I don't know what to think. Obviously they're not completely incompetent, but it's gonna take some re-education for me to see them as equal to Intel, let alone superior. Though, like many rivalries, it goes back and forth. (Except cola, those aren't changing recipes, New Coke aside... and as for that, I'm the odd duck who likes both Coke and Pepsi... and for different reasons/uses.) I'm willing to keep an open mind, though.

Anyway, yeah, I can't remember if my Xeon is 6th or 7th generation Intel. One of the two.

by [2025] most PCs will probably come with SATA SSDs.
I was just reading a thread on Reddit (I don't have an account, I just read the front page) where they were asking for the best thing Redditors have bought for under $50, and SSDs came up. While it's kind of disingenuous to put SSDs in the "$50 or under" category, technically you can probably get a 256GB one for somewhere in that range, but you probably shouldn't. Though, for most users who just do social networking, it's more than enough for an OS, a web browser, and some other miscellaneous tools.

I've always said that the best upgrade you can give any computer is a new monitor, and this was especially true when widescreen monitors first came out and got cheap. I got my 23" HP monitor for about $175, and while it's only 1080p, it's a good screen: no dead pixels, looks great, I just wish the power button weren't so mushy (HP 2311x). I still think a monitor upgrade can be a good boost to a system, but it doesn't impact performance at all. An SSD, the comment thread argued, is the cheapest and most effective way to increase performance on a lot of older machines where disk read/write is the bottleneck. On a desktop, they're super easy to install. "Worst" case scenario: yank the cables from the back of the hard drive, plug them into the [SATA] SSD, close the chassis, then power it up and install Windows. You don't have to screw the SSD into the chassis, as it has no moving parts; it's just a good idea for cable and air management, and it looks better. Also, it might be a good idea to leave the old hard drive connected, so you can transfer stuff off of it and then repurpose it for media storage. But even if you only have one SATA cable, or you don't know squat about computer building/repair, it's such an easy fix that does so much.

At this point, they should just stop making hard drives for system drives. I'm fine with the big ones, but 1TB, even 2TB, hard drives need to go away and stay gone. Systems should be shipping with an SSD for Windows and apps, and optionally a hard drive for media. Optionally, because external bus-powered drives like the WD Passport are so good and so cheap. You can get the 2TB WD Passport for $65 in black or red on Amazon, a little cheaper than I thought, and it uses USB 3 for data and power at the same time. I used to have the 500GB model, way back when. I now have a 500GB Samsung T3 portable SSD, and I love it. I paid about $200 for it, and the newer T5 model is now under $100. Think of it as a flash drive with a cable rather than a plug; it's still capped by the bus, but also way faster than any computer's internal hard drive. On a computer with an old-school hard drive, it's faster and more efficient to run PortableApps off the portable SSD than off the hard drive. Not to mention being able to take those apps from PC to PC while maintaining settings and such. (Just lock it down somehow in case it gets stolen; you don't want the thief getting access to your banking information and whatnot.)

Anyway, to get back on topic:

Skyrim SE had a 5GB limit on XB1
Is that so? Does Fallout 4 still have 2GB, then, or was it raised?
 
Intel to AMD, how do you feel about that? I've never been deep into the rivalry, but growing up in the 80s, I always had this perception that Intel was superior and AMD was the knock-off. My first PC used an Athlon64, and it was okay, but not great.
My first PC was an Apple ][ lol. Then my aunt gave me an 8088. Learning DOS was a PITA. Then later, I got a job, some $$$, and bought a 486... IIRC, the Athlon was vastly superior to the Pentium 4 at the time. It seems that since the second-gen i5 there has been no real IPC (instructions per clock) improvement. As semiconductor processes approach the atomic level, quantum effects come into play and cause problems... So, they (CPU manufacturers) have been moving toward adding more multi-threading capabilities to CPUs. To me, this has no impact on us gamers until a game comes out that can utilize these features. If you have professional apps that can utilize these resources, then it is worth the investment.

That said, I have an i5 6600k and a GTX 1070. I have no future plans on upgrading unless there is a game released (one I actually want to play) that requires me to do so. If your current system gets the job done, I would leave upgrades to a future you that needs to worry about it! I also am quite lazy and try to keep my upgrades to a minimum. It has the side effect of being easier on the wallet. ;)

Edit- I wonder how many dust bunnies are camping out inside my computers?
 
IPC isn't the big issue, though, because they've had roughly 8-10% gains with each model change; the big issue is the frequency being stagnant. 5.0 GHz has been hanging around for the last 10 years of processors, and they're only now getting past 5 GHz on the turbo clocks. Ten years and little progress; it should be around 7.5-10 GHz by now, but as you said, the setbacks of shrinking the process size have really slowed down both IPC and frequency improvements.
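
Just to put illustrative numbers on that (the per-generation gain and the generation count below are assumptions, not benchmarks): compounding roughly 9% IPC over, say, five model changes only gets you about 1.5x, whereas a clock jump from 5 GHz to the 7.5-10 GHz range would have been another 1.5-2x on top, and that's the part that never showed up. A quick sketch:

```python
# Purely illustrative comparison of compounded IPC gains vs. a hypothetical
# frequency jump. Per-generation gain and generation count are assumptions.

def compound_gain(per_gen_gain: float, generations: int) -> float:
    """Cumulative speedup after repeated per-generation gains."""
    return (1 + per_gen_gain) ** generations

generations = 5                                  # assume ~5 CPU generations over the period
ipc_uplift = compound_gain(0.09, generations)    # ~9% IPC per generation (assumed midpoint)
freq_uplift = 7.5 / 5.0                          # hypothetical 5 GHz -> 7.5 GHz clocks

print(f"IPC alone over {generations} generations: ~{ipc_uplift:.2f}x")   # ~1.54x
print(f"Hypothetical 5 -> 7.5 GHz clock jump:     {freq_uplift:.2f}x")
print(f"Both together:                            ~{ipc_uplift * freq_uplift:.2f}x")
```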

Plus, Intel has been even worse the last 3 to 5 years; they can't keep up with TSMC and their die shrinks. TSMC is down to around 5nm on their upcoming fabs, while Intel is struggling to get their 10nm stuff up and running. Now, Intel's 10nm and TSMC's 7nm are basically the same size as far as the dies go, but Intel is falling farther and farther behind the curve, not to mention their fourth CEO in the last few years just quit, making it hard to believe they're in a good spot at the moment. I'd say they're around 3 years behind AMD and TSMC's offerings.

Basically, AMD has the better processor gear at the moment, and Nvidia pretty much always has the better GPU gear, though AMD has some good GPU stuff right now too, except their cards can't come close in ray tracing or DLSS (AI upsampling). I'd guess their next offerings in a couple of years may be pretty much on par with Nvidia.

Starfield is pretty much gonna need a 2080 Ti equivalent or better if you want anything beyond 1080p low settings; for 1440p, probably a 3070, and 4K is gonna be 3080-class gear. Let's hope Bethesda has gotten their optimization game down better since FO4.
 
Let's hope Bethesda has gotten their optimization game down better since FO4.
This x100. Too much of the game is tied to just one CPU thread, which bottlenecks everything, and single-threaded IPC isn't likely to massively improve on CPUs anytime soon. Efficiency improvements and making better use of the hardware that's there are what's needed. The game can crash if there are too many draw calls at once or too sudden an increase in draw calls. Shadows should become GPU-based. Papyrus needs a massive increase in resources. AI improvements: better NPC behavior, with more NPCs and less performance cost. Level/world design will become far easier for Bethesda, with much better results, if they can take full advantage of 8 cores / 16 threads and include a better precombine/previs system, or replace that system outright. DLSS, and an equivalent to UE5's Nanite/Lumen tech if they could manage one, could help a ton and could significantly raise the upper limits for modders. It would also hugely support BGS in what's possibly one of their greatest strengths: designing the world. Taking engine optimization seriously could help enable things to Just Work for their game design. It could also help avert having to cut features or overhaul content for performance reasons, which wastes significant money spent on developer hours and can't be good for morale, nor for the reception of the game when expected or promised features aren't there. I believe this sort of stuff is what's needed in order to best support a hypothetical Sim Settlements for Elder Scrolls 6.

Even if it takes longer for TES6 to release, it'll absolutely be worth it. It's probably a great thing that Todd didn't give any estimate for the release date, as that could put them in a situation of feeling forced to release the game before it's ready if they were afraid of potential backlash over delays. The Elder Scrolls games release so infrequently that it'd be an absolute shame if the game turned out bad. So I think we'll just have to wait, no matter how long it takes.

So, they (CPU manufacturers) have been moving toward adding more multi-threading capabilities to CPUs. To me, this has no impact on us gamers until a game comes out that can utilize these features. If you have professional apps that can utilize these resources, then it is worth the investment.
We are probably approaching the point where that will start to happen. There are already some games that run better on higher core counts, and I expect it to happen a lot more once games are no longer released for PS4/XB1. After that point they can also leverage SSD read speeds to quickly pull large amounts of data from storage on demand, all without long load times. In a few years there will be DDR5 RAM with potentially much higher speeds and higher capacity, which can further help with this. They're hitting upper limits in terms of single-threaded capability, games won't otherwise be able to advance much, and graphics are generally good enough that they're not a selling point by themselves anymore. If they don't offer anything new or notably improved in terms of gameplay experience, we can just keep playing Fallout 4/Skyrim and others can keep playing existing stuff, and then there's less reason for people to want to buy new PC hardware or an XBSX or PS5.

Intel to AMD, how do you feel about that? I've never been deep into the rivalry, but growing up in the 80s, I always had this perception that Intel was superior and AMD was the knock-off.
It's been that way for many years, but Intel seems to have lost its way; at this point Intel seems to be the knock-off. They now have a new CEO, so maybe he can turn things around, but it'll probably take quite a few years. I intend to go AMD from now on unless Intel makes a big comeback at some point. I will only consider an Intel CPU next if it becomes clear there's no way I'm going to find a 5000 series Ryzen, even if I wait until my current CPU bottlenecks games to less than 40-50fps at 4K. I'd be fine just waiting for Zen 4 if need be, but it'll be on the new AM5 socket, which will be DDR5-only, and new RAM formats are typically vastly more expensive for a while after their initial release.

Is that so? Does Fallout 4 still have 2GB, then, or was it raised?
Nope, it remains 2GB for XB1, as well as no more than 1GB for a single file (no idea for PS4, probably still 900MB), and is unlikely to ever change unless they do a full re-release for Series X/PC, like a Fallout 4 Special Edition or an "RTX Edition". I kind of hope they will, especially if Starfield ends up not really being a game for TES/Fallout fans, although there'd be concerns about splitting the community if we had FO4 and also FO4 Special Edition. I guess the only thing that'd make sense is having it as a free upgrade on PC for FO4 + Season Pass owners to encourage migration, although at that point it might not make enough money to justify much of an upgrade. They might pass on it altogether too, since the game can already be played via backwards compatibility.
 
I'm also hoping for an FO4 re-release that actually takes advantage of current hardware. But I don't know if Bethesda cares enough about Fallout and its IP like they do The Elder Scrolls. I wouldn't mind seeing Skyrim released for the new systems as well.

Yeah, as far as the precombine/previs system goes, it needs to go. I can understand combining stuff like the background environment or set pieces, but precombines on everything ruined FO4 for modders who learned on Skyrim and its easy-to-use setup.

Basically, until the whole engine is rewritten, it won't ever function to its fullest. Although I guess, from recent rumors, they have been doing exactly that: there have been rumors of them bringing in outside companies to overhaul the engine to be able to run Starfield and its vastly more demanding requirements.
 
and Nvidia pretty much always has the better GPU gear
There was a time, back in the stone age, when ATI was a thing. nVidia and ATI were playing leapfrog for the #1 spot... :nerd:
IPC isn't the big issue, though, because they've had roughly 8-10% gains with each model change; the big issue is the frequency being stagnant.
Typically, in the past, IPC increased naturally with a die shrink. As physics dictates that die shrinks become more difficult, as we have seen, IPC gains are small. I agree that a boost in frequency would be a boon. However, as the process gets smaller, the max frequency is limited due to signal bleed. I'll spare you further techno mumbo jumbo, as I have forgotten most of the techno terms...
I'd be fine just waiting for Zen 4 if need be, but it'll be on the new AM5 socket, which will be DDR5-only, and new RAM formats are typically vastly more expensive for a while after their initial release.
This is a perfect example of why I'll wait until a game requires me to upgrade. I mean, even my 10-year-old AMD laptop is good enough to surf the net... God knows, by the time TES 6 drops, I'll need an AMD 15500 and an Nvidia RTX 6080. :lol The future me will have to worry about that...

I do hope Starfield is something I'll be interested in playing. The games released in the last 4 years or so do not seem that interesting to me. I mean, a couple of years ago, I resorted to buying a Neptunia collection that was on sale at Steam to get my RPG fix... :confused: That should say something... The sad part was it was fun and I enjoyed it... If you take out the zany characters and JRPG style battles, it had a Fallout 2 feel with the jokes, references and 4th wall breaking. Wasteland 2 was good to about the half way mark, then it seemed to be very sameish. I have yet to complete it. Pathfinder: Kingmaker was pretty decent and kinda Bethish with the bugs. It also had a pseudo settlement system that could waste your hours. I think they could have done away with the time advancing mechanic and just put a button in the menu called "Let's get on with it!"

Maybe I'm just looking back with nostalgia glasses. I hope the future brings us something good instead of rehashed shooters ported to PC from console...
 
There have been rumors of them bringing in outside companies to overhaul the engine to be able to run Starfield and its vastly more demanding requirements.
Hopefully the rumors = reality! :)
 
There was a time, back in the stone age, when ATI was a thing. nVidia and ATI were playing leapfrog for the #1 spot...
ATI got bought by AMD, and nVidia became Nvidia. I always thought people writing "Nvidia" were just being lazy but apparently the official spelling (capitalization) changed.

One thing I remember about ATI cards was the ability to have a desktop much larger than you could display, and you'd only see part of it (what you could actually display), and you could push the screen to see the rest. It sounded cooler than it was, and I don't think the feature exists in the current Radeon software.

Wasteland 2 was good to about the half way mark, then it seemed to be very sameish.
I wanted to like Wasteland 2, but about an hour in, the difficulty ramped up fast. The first part is just really easy, but then I felt shoehorned into entering a difficult area I had no hope of beating, and losing didn't progress the story. Eventually I gave up. While the random battles were actually cool as hell, I've never been a fan of turn-based combat. I played the original Fallout when it was new, but I didn't get far, and I've never beaten it. Got even less far into Fallout 2. My real entry into the series was Fallout 3.

Pathfinder: Kingmaker was pretty decent and kinda Bethish with the bugs. It also had a pseudo settlement system that could waste your hours. I think they could have done away with the time advancing mechanic and just put a button in the menu called "Let's get on with it!"
Went to ask a stupid question... got Pathfinder confused with something else. I can't think of what right now. Obsidian's game series, turn-based like Wasteland 2. So, Pathfinder. The D&D 3.5 knockoff. I got all the source books in a Humble bundle but never really looked through them. I'll have to look into that one, though I never could get into the real D&D games, like Baldur's Gate. And I know Pathfinder (and D&D 3.5) is a lot of numbers, not sure how that translates to gameplay.

...Path of Exile... I think that's what I thought you meant at first. Anyway... some of the technical stuff (die sizes, IPC) is a bit over my head, but good to read the back-and-forth just the same.
 
I always thought people writing "Nvidia" were just being lazy but apparently the official spelling (capitalization) changed.
O.O!! I didn't know that... Maybe I should put TechpowerUp back in the rotation... I quit paying attention as it seemed the only things being reported were the latest bling gaming accessories...
I wanted to like Wasteland 2, but about an hour in, the difficulty ramped up fast.
What I did was trigger random battles until I was man enough to proceed. It was a slog. It is another game where all the effort went into the beginning, and it became more plain (stale?) as time went on. Towards the end, where I lost interest, it seemed to degenerate into basic fetch quests with no real story. You also had to be careful when exploring the world map, as they used deadly radiation to gate where the player could go. I fell victim to that a few times just randomly clicking on a spot to see what was where.

To me, the first two Fallouts weren't great because of the combat (though that was pretty deep as well); it was the dialog. Being lippy to the wrong person would get you dead, and asking a seemingly insignificant question would make you king of the world. Dialog choices were meaningful and had consequences, something that has almost disappeared from games released in the last 5 years or so. To me, Fallout 3 was 80% better than Fallout 4 in this regard. It's most likely why my nostalgia glasses hold Fallout 3 as the best game Beth ever released.

I don't know if I would call Pathfinder a D&D knock-off. I'm sure it was the main inspiration, but there is a lot in the ruleset that is different. Sadly, I can't remember the details. (CRS sucks)
And I know Pathfinder (and D&D 3.5) is a lot of numbers, not sure how that translates to gameplay.
The beauty of putting these tRPGs on a PC is you do not need a dump truck's worth of different #-sided dice! PCs are good at math, just not so good with RNGs. The rest boils down to how good the game maker is at adding the text to go with the numbers, and the systems that make use of them. To me, Pathfinder was great minus the time mechanic that was used. It had a bunch of bugs at release and I do not know if they ever fixed them all. IIRC v2 was released.
...Path of Exile...
Great. Now I have to burn my geek card and remove all associated credentials... I did not know about this. I wonder how many hours I would have sunk into it?
some of the technical stuff (die sizes, IPC) is a bit over my head
Die size is technical but can be reduced to thinking in really small mm (well, nm) numbers! IPC = instructions per clock, i.e., how many instructions the chip can execute per clock cycle. So, if you have a 1 GHz processor and it can process 100 instructions per clock, it executes 100,000,000,000 instructions (basically equations) per second. The preceding is a fictitious example and is in no way related to any physical property <-- just in case! :lol
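
If it helps to see that multiplication spelled out, here's a trivial sketch using the same made-up numbers as the example above (nothing to do with any real chip):

```python
# The same fictitious example as above: throughput = clock frequency x IPC.

def instructions_per_second(freq_hz: float, ipc: float) -> float:
    """Theoretical instruction throughput for a given clock speed and IPC."""
    return freq_hz * ipc

freq_hz = 1_000_000_000   # 1 GHz
ipc = 100                 # made-up IPC from the example
print(f"{instructions_per_second(freq_hz, ipc):,.0f} instructions per second")
# -> 100,000,000,000 instructions per second
```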

In my previous post I forgot The Outer Worlds. There is another game that had a great beginning, a sameish middle, and was out to lunch for the ending. They either ran out of time, interest, or money <-- pick 3... It is also a game that did not require me to upgrade any hardware. It seems the game companies are pouring their resources into shooters. In all honesty, I would not have upgraded my GTX 970 to a GTX 1070 if I hadn't bought a new 2560 × 1440 monitor that the 970 couldn't handle.
 