If 90% of gamers can’t play a game, is it still worth releasing it like that?
I doubt 90% of players run the newest games at 4K/high.
It’s not the resolution:
Even with AMD FSR 1.0 at 50% resolution scale, the game cannot come close to 30fps.
Dude, if someone is spending 1.8k on just a fucking CPU and GPU together (and that doesn’t include the cost of the motherboard, RAM, storage, case, monitor, or mouse), I would fucking hope they could run their new game release at fucking 4K 60fps (minimum) natively.
Game dev companies got lazy. Instead of DLSS and FSR being really great tools for older GPUs to run newer games, it became a crutch for brand new $900 GPUs to run newer games.
Don’t get me wrong, DLSS and FSR are awesome and I use them to get games to run well at 4K with my 3070 Ti, it’s just a shame so many devs are abusing it.
I think it’s a bit unfair to say they got lazy. They just shifted their development to lower the priority on optimization. Even though corporate game development sucks, I don’t think I’ve seen many “lazy” game devs. Many of them work pretty hard jobs for shit pay, at least compared to other programming fields (rough crunch periods, most of their audience hates them, etc.).
Absolutely, any lazy gamedev would just quit, get a boring SWE job, and work fewer hours for twice the pay.
No. This is a city builder, not cyberpunk. This is a cpu optimization problem, not a gpu one. Also, a lazy development team doesn’t add enough features for it to suffer from such major optimization problems.
deleted by creator
Oh shit, I actually missed that last part of the headline. Mea culpa.
I just built a 7800X3D / RTX 4090 build, so I’d expect to hit 4K 60fps, but I’m more of a 1440p 240Hz guy. I guess I’ll settle for whatever I can get with this game lmao. At least it’s on Game Pass.
at 4K/High Settings
Do you believe 90% of gamers will be playing at 4K/High settings?
… on AMD’s most powerful GPU.
I mean… At the current state of the game, 0% of gamers will be playing at 4K/High settings.
I don’t know what “high” refers to in this instance, but in general I kinda wish every game had its very highest settings targeted at future hardware. Not out of bad optimization, but simply because it feels stupid playing older games that cap render distances, LoDs, foliage amounts, crowd sizes, lights, shadow qualities, etc. to hardware limits that were set a decade or two ago.
Just make it obvious and don’t call it “Very High” or “Ultra”, but directly just “Next-Gen” or something in the settings and have it target like 720p 30fps on a 4090.
That’s basically what Crysis was when it released, so yeah why not?
Because Crysis, for its time, was breaking barriers in terms of graphics and physics. Cities Skylines 2 doesn’t even look that good (graphically). So it just comes down to poor optimization that will get fixed after half a year to a full year of patching. This isn’t a great look, even if they can say “But we told you it would perform poorly”.
“PURR URPTURMIZURTION”
Or… here’s a fucking idea… it’s a CPU bound game and not GPU bound. FUCKING WOW, WHO WOULD HAVE THOUGHT that the simulation game may not be graphically amazing but will wreck the shit out of any CPU with its simulation routines?
Only everyone that’s ever played any sort of in-depth simulation, that’s who.
You would have a point if you couldn’t gain 20 fps by disabling clouds and volumetric fog and dropping LOD to the bottom. Also, wouldn’t the FPS barely change as you increase the resolution, since you’re putting more work on the GPU instead of the CPU?
You don’t even have the game and you are shilling for it super hard for some weird ass reason.
Crysis was the game that got me to stop being lazy and finally build my first PC.
I don’t get why people are mad about this. I’m happy that games are coming out that destroy top setups today, because that means they’ll be beautiful (hopefully that’s what they are with max settings) with future hardware.
The issue is when the game destroys top setups because it’s poorly optimized and full of bugs, and I don’t think it was their idea to make a game for future hardware, because that would not be commercially viable.
You clearly never played CS1 on release.
That’s why you don’t run on high settings you fucking genius.
It’s a single-player game, who gives a shit how someone runs it. If someone is spending 1.8k on just 2 parts, I think it’s fair to hope a game will run “well”; this is abysmal.
Welcome to PC gaming. New here?
Nope, it’s pretty rare for games to release in such a horrid state that even top-of-the-line stuff isn’t able to power through it. Typically it’s the midrange/low-end cards that are stuck with horrid frames and rely heavily on DLSS/FSR (even though that is annoying). The meme of “Can it run Crysis” shows how rare it is for a game at its highest settings to be literally unplayable with modern hardware.
You don’t pay much attention do you?
Here’s an eight year old thread where a bunch of REEL GAMERZ MAN whinge about the same shit as now.
https://forums.tomshardware.com/threads/what-is-happening-with-recent-aaa-games.2866464/
“Can it run Crysis” stuck around because Crysis was built for hardware that didn’t even exist yet, but was scalable enough to be played on then-current gen. It wasn’t because poor performance was rare.
Go to literally any major release’s Steam review page and you’ll see review after review shitting on the performance. Almost like there’s no way to fix shit you don’t know is broken.
Again, my literal point was that poor performance typically hits mid-range and low-end rigs. This is a literal new release that, as the fucking title of the fucking post says, is
“Cities Skylines 2 reportedly runs with 7-12fps on an Intel Core i9 13900KS with AMD Radeon RX 7900XTX at 4K/High Settings”
7-12 FPS on top of the line gear is fucking stupid. Even Jedi Survivor wasn’t this bad and that was also a game that had “poor” performance for top tier gear.
Lower resolutions and graphical settings exist.
You have no point.
Shut the fuck up.
GPUs have been general-purpose many-core processors for like fifteen years now. Please stop designing simulations that run exclusively on CPU… and especially stop tying your simulation speed to the goddamn renderer.
I’ve written a 60 Hz renderer for a game that’s allowed to chug while you glide smoothly through it, and I’ve written a 60 Hz physics engine for a game that gets fractional frames per second, and I’m just some schmuck. What is your excuse? What kind of NP-hard nightmare did you design for yourselves, instead of identifying bottlenecks and faking the hell out of them? SimCity could run on 8-bit microcomputers. You cannot possibly be struggling to reach an acceptable minimum complexity, using hardware that’s forty years newer and ten thousand times faster.
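For anyone wondering what “not tying simulation speed to the renderer” actually looks like: below is a minimal sketch of the standard fixed-timestep accumulator pattern. All names here are illustrative and this is not code from any actual engine — just the idea that the sim advances in fixed 60 Hz steps no matter how long the renderer takes per frame.

```python
# Hedged sketch: decoupling a fixed-rate simulation from a variable-rate
# renderer. SIM_DT, run_frames, etc. are made-up names for illustration.

SIM_DT = 1.0 / 60.0  # simulation always advances in fixed 60 Hz steps

def run_frames(frame_times, max_steps_per_frame=5):
    """Given a list of real frame durations (seconds), return how many
    simulation steps ran each frame. The renderer can chug (long frames)
    while the simulation still ticks at a deterministic 60 Hz."""
    accumulator = 0.0
    steps_per_frame = []
    for dt in frame_times:
        accumulator += dt
        steps = 0
        # Clamp steps so one huge frame spike can't spiral the sim.
        while accumulator >= SIM_DT and steps < max_steps_per_frame:
            accumulator -= SIM_DT
            steps += 1
        steps_per_frame.append(steps)
    return steps_per_frame
```

A fast frame runs zero or one sim steps; a slow frame catches up with several; an absurdly slow frame gets clamped and the sim slows down gracefully instead of locking the whole game to render speed.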
I’m pretty confident the game isn’t tied to framerate, and also the game is almost always GPU bottlenecked from what I’ve heard. From what I’ve watched of the game, it has a ton of compute shaders and other shader work. In particular, weather is apparently a large cause of framerate issues, particularly temperatures. That’s because (I’m betting) the game is computing temperatures on the GPU and using that to draw snow and other things on the terrain and also structures. I’m pretty sure they know what they’re doing. They just did too much, and now they need to try to optimize it.
They don’t work on a custom engine, so they don’t have engine programmers, so they don’t know how, or probably can’t (I don’t know how you would do that in Unity). That’s the price you pay for ease of development.
If Unity is so dodgy that a team of professionals can’t figure out how to spread a workload over time, they should have written it off immediately. The nature of their game was not a surprise. They’re not naive in this genre - it’s a direct sequel.
Unity has its issues, especially lately, but the engine is perfectly capable of doing what they want to do. Most of the GPU load is likely just HLSL that’ll run pretty much (if not exactly) identically no matter what engine you’re using. UE would be a horrible choice for the game, especially 5. They could make their own engine, but no way they do that and don’t have to cut at least half of the features of the game.
The issue is just they did too much without enough time to optimize. They said they’re working on it and they aren’t happy with where it is. They’ve earned my trust, so I’ll take their word that it’s being worked on. Don’t just assume they’re telling the truth, but give it time.
They could make their own engine, but no way they do that and don’t have to cut at least half of the features of the game.
As opposed to now, where they also can’t handle the features of the game.
Unity being famously jam-packed with features relevant to a cutting-edge city simulator.
No kidding they’re going to make it better, over time, but there is no way this snuck up on them.
An engine is just a set of tools stuck together. If they have to write their own renderer, editor, interpreter/compiler, and everything else, that’s a ton of investment that you then can’t spend on game features. You don’t do that unless you have a very good reason. You are also required to maintain them yourself; you don’t just get upgrades essentially for free as the engine updates.
Unity actually does have a lot of features that are useful for a city builder, like ECS. I don’t know if you’re trying to be sarcastic.
The performance probably didn’t “sneak up on them” but they almost certainly didn’t know how it’d end up. There’s likely still a lot of optimization left in there and a lot of optional things that can be enabled/disabled. There’s no way to know how the end product is going to look until you’re nearing the end and all the pieces come together and time starts running out.
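For context on why ECS keeps coming up here: the draw is the data layout. Instead of scattered per-object GameObjects, components live in flat arrays and a system walks only the data it needs. Here’s a rough illustrative sketch of that idea in Python — this is not Unity’s actual ECS API, just the layout concept behind it.

```python
# Hedged sketch of the data-layout idea behind an ECS: components stored
# in parallel contiguous arrays, entity id = index. Names are made up.

class World:
    def __init__(self):
        self.positions = []   # component array: (x, y) per entity
        self.velocities = []  # component array: (dx, dy) per entity

    def spawn(self, pos, vel):
        self.positions.append(pos)
        self.velocities.append(vel)
        return len(self.positions) - 1  # entity id is just the index

def movement_system(world, dt):
    # Iterates two flat arrays in lockstep -- cache-friendly compared to
    # chasing pointers through thousands of heap-allocated objects.
    for i, ((x, y), (dx, dy)) in enumerate(
            zip(world.positions, world.velocities)):
        world.positions[i] = (x + dx * dt, y + dy * dt)
```

For a city sim with hundreds of thousands of agents, that linear-scan layout is the whole point: each system touches a dense slice of memory instead of the whole object graph.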
“This commercial engine cannot handle our game” is a pretty good reason.
However well ECS does its thing - it’s obviously not doing enough, for this game. Even if this somehow came on by degrees, and performance got just the tiniest bit worse day after day, the time to stop tolerating that and pursue performance was a year ago. That doesn’t rule out bugfixes. That doesn’t rule out new systems. That doesn’t rule out major changes.
But they’ve surely been fighting Unity for a long damn while and this is the best they could do.
“This commercial engine cannot handle our game” is a pretty good reason.
The “engine” isn’t at fault. You can continue to add things that consume resources and you eventually use up too many resources. That’ll happen on any engine.
However well ECS does its thing - it’s obviously not doing enough, for this game.
I believe you’re wrong here actually. By all accounts I’ve heard, it’s GPU bottlenecked even with increased entity counts over C:S1. It’s likely just too many shaders doing too much work too frequently. Weather and temperature are both apparently big hogs, which to me looks like the perfect opportunity for shaders to handle a lot of the work and I’m sure that’s what’s happening.
Don’t worry. They’ll release optimization as a DLC.
Performance was known to be poor, but if it were that bad, the ton of YouTubers doing their preview coverage would have reported it.
No, there was a performance embargo for reviewers that wasn’t lifted until after the developers had made their statement a few days ago.
It was pretty funny seeing stuttery footage on 60fps YouTube videos without any acknowledgement from the player lol
And they thought that just ignoring such a clear issue was a good approach to take? Wow, that’s fucking scummy on both sides.
There is a reason for them to not report on it. They were still working on the game (and they still are even). They don’t know how the performance will end up at release until it’s there. Reporting on it too early just misinforms people.
Well, surely if they’re playing it 2 weeks before it’s due to launch and it runs like garbage, they’d think “hmm, maybe this won’t be ready in time. I should probably tell people about it” rather than just being greedy and sweeping it under the rug. Also, you can be honest about issues you experience with the people watching your content. If it gets better before it’s released, you just make an update video stating you’ve seen an improvement over time. No need to hide it
I’ve watched a good bit of the game so far. I don’t think anyone hasn’t discussed performance. It’s not something being hidden, it just isn’t where it should be or where they want it to be, and they’ve been clear they’re going to continue working on it to get it where it should be. They just can’t hit that target for launch. Delaying it wouldn’t be great either because plenty of people will be able to run it, just not as well as it could be. That’s OK in my opinion.
I think they did well, tbh. They of course hoped they would be able to resolve it before release, so it shouldn’t have been the focus for reviewers.
At least reviewers made it clear that there still was something to be said about performance, but it would have to wait.
And Colossal Order then made a clear statement on performance themselves ahead of the launch date.
They do. Look at City Planner Plays’ video. Going by that, I can hope to get a bit more than 30fps at 1080p with medium details on my 4070 Ti and 7600X. Beyond that, I’ll get a slideshow. For most PC owners out there, the game will be unplayable in its current state.
Most YouTubers have beefy rigs. Also, the preview build could have some kind of limitation which was never intended for the final release but which improves performance
Probably will trial it and then wait for sale. By the time it goes on sale, it should run better lol
So, this is releasing on PlayStation and Xbox as well? How the fuck will they be able to run it -at all-, 480p and 30 fps on low?
Maybe that’s the reason the console release was pushed to next year.
deleted by creator
It will run at a glorious 60fph.
I mean, I doubt that, considering the PS5 and Series X have roughly the performance of an RTX 2070, while the Series S has roughly that of a GTX 1650. If that’s anywhere near the truth, there’s something seriously wrong with the game design. That’s about a 150% difference in performance compared to the hardware in this post.
Edit: yes I know I misread his joke, I addressed it further down.
I must admit that I didn’t do any research, my previous comment was meant as a joke.
Dammit, flew right over my head
I wonder what I did wrong on the delivery of the joke.
He said fph not fps
Linked article is nothing but Unreal Engine fanboy masturbation.
Those people are fucking weird.
Also a bizarre comparison. Cities 2 is a simulation game – those are very CPU-intense games. The graphics are nice, but the problem is likely balancing the CPU demand against the graphics, rather than the graphics themselves. Simulation bottlenecks will drop the FPS drastically, regardless of the graphics engine.
From what I’ve seen of the game on Twitch, I think the performance issues aren’t game-breaking. It seems the game runs fine if you reduce settings; while it’s far from ideal, it looks playable.
But it will be damaging for the game. Mods won’t launch until after the game does, and that may be delayed further by time spent fixing the game post-launch. For a game that succeeded in very large part due to user content, that may really harm its success.
Linked article is nothing but Unreal Engine fanboy masturbation.
UE having better 3D performance than Unity isn’t really that much of a hot take. Unity got that much traction because of its really favorable licensing terms before the recent change.
Have you seriously not noticed how there’s this weirdo subset that feels the need to throat UE every chance they get?
Why does the author specifically mention UE and not any of the other engines on the market? Why not Source, which is renowned for being one of the most flexible and performant engines out there?
UE fanboyism is real and it’s fucking weird.
Have you seriously not noticed how there’s this weirdo subset that feels the need to throat UE every chance they get?
No, I didn’t but I’m also not diving into every tech subculture.
Why does the author specifically mention UE and not any of the other engines on the market?
My guess is because of its versatility and the rumor from a couple of months ago that Cities 2 was UE-based. As an open-source proponent myself, I would like to see the CryEngine-derived Open 3D Engine (O3DE) or Godot gain more traction, but at least the latter is still lacking some features other 3D engines have had for ages – I seem to recall reading a few weeks ago that shader stuttering is still a thing even in the newest Godot release. I don’t think any shipped product is using O3DE, so using it would be a big gamble for a relatively small development studio.
Why not Source, which is renowned for being one of the most flexible and performant engines out there?
Source 1 is pretty outdated, and Source 2 is used by Dota 2 and two FPS games.
The worst thing is when people who have absolutely no clue about game development believe advertisements and hype and get really opinionated about stuff they know nothing about, such as engines, tech or techniques.
Also worth noting that Skylines 2 comes out on Xbox game pass day 1. You can usually pick up a trial for a fortnight, that’s a pretty perfect opportunity to try this on PC (to see how bad it runs for you)
That’s what I’ll be doing, trying it out and most likely skipping it for a few months while they polish it up