
No Man's Sky PC Version doesn't work on older CPUs

Aselith

Member
I know I'm gonna get a lot of flak for this, but seriously - don't cheap out on CPUs. Yeah, you can technically get away with lower-end CPUs, but don't. The number of people pairing 980-class GPUs with either an AMD CPU or a very old Intel (i7 920) is ridiculous. Ever heard of bottlenecking? I'd bet that in at least a few cases a CPU/RAM upgrade and a lower-end GPU would provide better results for some users.

Let's face it, guys: you didn't buy an old AMD CPU because it was the best, you got it because you were cheap, and you obviously get what you pay for.

Now, this might have had the SSE4.1 flag enabled at compile time even though it's not required, so disabling that flag and recompiling might fix it. Or SSE4.1 might actually be required (let's face it, it's a CPU-heavy game - and by that I'm not talking about pure power, but complexity).

After all, what do you think instruction sets are? Miscellaneous fluff just to artificially inflate a game's requirements? No, they give the CPU a helping hand at doing incredibly complicated operations quickly. It's probably a compile flag that can be disabled and isn't a hard requirement, but who knows for certain until a patch is out?
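
For what it's worth, here's a minimal GCC/Clang-flavoured sketch (mine, not anything from Hello Games) of how a game could check the SSE4.1 CPUID bit at startup and bail out cleanly instead of dying on an illegal instruction:

#include <cpuid.h>
#include <cstdio>
#include <cstdlib>

static bool cpu_has_sse41() {
    unsigned eax, ebx, ecx, edx;
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    return (ecx >> 19) & 1;  // CPUID leaf 1, ECX bit 19 = SSE4.1
}

int main() {
    if (!cpu_has_sse41()) {
        std::fprintf(stderr, "This game requires a CPU with SSE4.1.\n");
        return EXIT_FAILURE;
    }
    // ... launch the game proper ...
}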

I don't have trouble running games. There's no bottleneck to fix....

The only games I really occasionally have trouble with are poorly made Early Access games.
 

grmlin

Member
I could run everything until now on my ancient 1055T + 7950 combo. Sure, not at max settings and 4K, but I'm used to my PS4 and my PC still performs better most of the time.

It's not like this game looks better than some of the other games I played in the past.
 
I'll bite.

What are you even talking about? I bought the Phenom II seven years ago and it's been running every game no problem since. NMS is merely the bad-port exception that proves the rule. I fully expect other games to keep working flawlessly, and will only upgrade once games start demanding too much power or the hardware breaks down.

Same. One guy on the Steam forums was harping on about how my CPU is outdated and should be upgraded for CPU-intensive games like Fallout 4.
Recorded some gameplay on ultra settings for him.

https://youtu.be/SXY4AY__JRU

Then he was like, the average fps was mid-50s.
Pretty sure I would have averaged above 60 if I had turned off vsync.

A better CPU will definitely give more fps. But if 99% of games run at 1080p/60fps at settings that are way better than console settings, what's the point of buying a new CPU, motherboard and possibly RAM? Only to play at 100 fps?
 

Lkr

Member
I agree that the fact this doesn't run on a Phenom II is bullshit, but at the same time I'm glad I upgraded to an i5 3570K a few years back. I don't understand how anyone in this thread is defending this game not running on these CPUs when the devs never said it wouldn't.
 
I agree that the fact this doesn't run on a Phenom II is bullshit, but at the same time I'm glad I upgraded to an i5 3570K a few years back. I don't understand how anyone in this thread is defending this game not running on these CPUs when the devs never said it wouldn't.
Because the fanbase for this game is probably the ugliest I've seen in a non-multiplayer (lol) game in possibly forever. They are currently behaving like a dog with rabies.
 

lmimmfn

Member
Thanks for the tip.
My second system needs an upgrade and the 5670s I find on ebay will do nicely.
No probs, it's a great upgrade: higher overclock, less power and extra cores. Do check your mobo compatibility though; although my Gigabyte X58 UD5 v1 doesn't officially support it, it works fine.
 

Fafalada

Fafracer forever
blu said:
Point being, if the game originated on the PS4, it's not entirely improbable some SSE4.1 code sneaked in.
Oh, it's entirely possible - but my point was that even Sony's internal libraries have non-SIMD fallbacks; short of some special piece of handcrafted code, you're extremely unlikely to find something that ships only in a SIMD variant, and equally unlikely to write new code that way from scratch. I.e. they probably have an easy workaround - even if it comes with some performance trade-off.

Aside from the GPGPU part, which I generally disagree with (whether GPGPU would be of use largely depends on the latency requirements of the situation and where the data is heading, among other factors)
I didn't mean to come across as generalizing - I was alluding to the scenario mentioned by someone earlier about possibly writing "large" amounts of hand-crafted intrinsic or inline-asm code, which would have questionable benefits in and of itself. But if we "are" talking about a larger algorithm that is dominated by vector-math costs, there's a reasonably good chance it will be possible to shuffle it over to the GPU (and on a console, you have much finer control over the latency and synchronization of async GPU jobs).
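
To make the fallback idea concrete, a rough sketch (illustrative only, assuming GCC/Clang - definitely not Sony's or Havok's actual code) of how a library ships both paths and binds one at startup:

#include <cmath>
#include <cpuid.h>
#include <smmintrin.h>  // SSE4.1 intrinsics

static void floor4_scalar(const float* in, float* out) {
    for (int i = 0; i < 4; ++i)
        out[i] = std::floor(in[i]);
}

__attribute__((target("sse4.1")))
static void floor4_sse41(const float* in, float* out) {
    // roundps is SSE4.1-only: exactly the kind of instruction that kills
    // a Phenom II when no fallback exists.
    _mm_storeu_ps(out, _mm_floor_ps(_mm_loadu_ps(in)));
}

static bool cpu_has_sse41() {
    unsigned eax, ebx, ecx, edx;
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    return (ecx >> 19) & 1;  // CPUID leaf 1, ECX bit 19 = SSE4.1
}

// Callers go through the pointer and never know which variant they got -
// which is why keeping the slow path around is usually cheap.
static void (*floor4)(const float*, float*) =
    cpu_has_sse41() ? floor4_sse41 : floor4_scalar;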
 

dogen

Member
Oh, it's entirely possible - but my point was that even Sony's internal libraries have non-SIMD fallbacks; short of some special piece of handcrafted code, you're extremely unlikely to find something that ships only in a SIMD variant, and equally unlikely to write new code that way from scratch. I.e. they probably have an easy workaround - even if it comes with some performance trade-off.


I didn't mean to come across as generalizing - I was alluding to the scenario mentioned by someone earlier about possibly writing "large" amounts of hand-crafted intrinsic or inline-asm code, which would have questionable benefits in and of itself. But if we "are" talking about a larger algorithm that is dominated by vector-math costs, there's a reasonably good chance it will be possible to shuffle it over to the GPU (and on a console, you have much finer control over the latency and synchronization of async GPU jobs).

They said they had to rewrite "thousands of lines of assembly".
 

Narroo

Member

Are the system requirements for No Man's Sky inaccurate?

They're vague: what does 'i3 minimum' even mean? I suppose if I bought a CPU from the same line as the i3s I'd know whether I hit the minimum or not, but what if I have an AMD processor?

It's possible to have a CPU that generally outperforms the 'minimum requirements,' as some other posters here mentioned, but does not support the SSE4.1 instruction set. In that case, what does 'i3 minimum required' even mean? How is a person supposed to know that it means you need to support a certain CPU feature most people haven't even heard of?

Worse, it's possible that the game doesn't even need the required feature. It's fairly easy when programming to pull in something that relies on some new standard without really using it, in which case you break compatibility for no good reason.

While they may not need to support an old standard, if you can preserve compatibility easily enough, there's no point excluding hardware just for the sake of being old. The general ideal is that programs should be compatible with CPUs as long as the CPUs can handle running the program. This might not always be viable, but it's an ideal. So the question is: does No Man's Sky really need SSE4.1?
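
To illustrate (a toy example of mine, nothing from the game's actual build): the code below uses no SSE4.1 features at all, yet the build flag alone can break Phenom II compatibility.

#include <cstddef>

// Perfectly ordinary code with no SSE4.1 intrinsics anywhere.
float sum(const float* v, std::size_t n) {
    float s = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        s += v[i];
    return s;
}

// But if the whole project is built with an aggressive baseline, e.g.
//     g++ -O2 -msse4.1 game.cpp
// the compiler is allowed to emit SSE4.1 instructions wherever it likes,
// so the resulting binary may fault on pre-SSE4.1 CPUs even though the
// source never "needed" the feature.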
 
Last version: 2011.2.0, released on September 14, 2011.

Now let me find the documentation to see if it requires SSSE3.

It doesn't. Hundreds of games released after that use Havok and run fine on an AMD Phenom II.

They said they had to rewrite "thousands of lines of assembly".

Frankly, I don't believe them. They just did a bad port of the PS4 version to PC, didn't test it properly, and are now making excuses.
They're a small team; they should have taken their time and released the game on PC six months later.
 

ethomaz

Banned
It doesn't. Hundreds of games released after that use Havok and run fine on an AMD Phenom II.
That is not how it works.

When you use an API there are different functions/calls that can require a more specific instruction set than others... what you do in most cases is avoid those for compatibility.

So other games running Havok on a Phenom II means nothing for the NMS case...

BTW, instruction sets greatly boost the performance of code; that is how Havok can match GPGPU physics on CPUs... they rely heavily on SSE2 and SSE3... I wouldn't be surprised if they use SSSE3 for specific cases.
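
For example (purely hypothetical - I don't know Havok's internals): pshufb is an SSSE3-only byte shuffle with no one-instruction SSE2 equivalent, so any unconditional call to it hard-requires SSSE3.

#include <tmmintrin.h>  // SSSE3 intrinsics

// Hypothetical example, GCC/Clang-style: reverse the 16 bytes of a vector.
__attribute__((target("ssse3")))
__m128i reverse_bytes(__m128i v) {
    const __m128i idx = _mm_set_epi8(0, 1, 2, 3, 4, 5, 6, 7,
                                     8, 9, 10, 11, 12, 13, 14, 15);
    // pshufb: one instruction on SSSE3, an illegal instruction on a Phenom II.
    return _mm_shuffle_epi8(v, idx);
}

// An SSE2 fallback needs several shuffles/unpacks to do the same job,
// which is why a library might call pshufb unconditionally - and why
// doing so quietly raises the minimum CPU requirement.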
 
Eventually everything will be obsolete.

But here, there is nothing in NMS to justify the incompatibility. It doesn't perform exceptionally well, nor does it have anything exceptional in terms of technology. It's literally just a survival sim with procedurally generated levels.
 
So... The PC version runs the PS4 SDK in the background according to the data mining thread... Could the SDK require the instruction set?
 

ethomaz

Banned
So... The PC version runs the PS4 SDK in the background according to the data mining thread... Could the SDK require the instruction set?
Jaguar cores indeed support SSE4 and SSSE3. Unless Sony's SDK is unoptimized, it uses these instructions, because they are faster than doing the same operation via generic code.

There is no benefit to not using them on a fixed platform like the PS4.
 
Jaguar cores indeed support SSE4 and SSSE3. Unless Sony's SDK is unoptimized, it uses these instructions, because they are faster than doing the same operation via generic code.

There is no benefit to not using them on a fixed platform like the PS4.
But if the PC version is running the game in the PS4 SDK, Hello Games may not be able to change it?

I'm asking because I don't know how that works.
 

ethomaz

Banned
But if the PC version is running the game in the PS4 SDK, Hello Games may not be able to change it?

I'm asking because I don't know how that works.
They need to code a slow path that doesn't use these instructions, and it seems like they did that for SSE4.1, but SSSE3 is being used by some call in Havok that they can't recode themselves.
 

ethomaz

Banned
SSE 4.1 is the reason why older C2Q 9xxx still trump newer AMD CPUs on some emulators.

Just saying.
Any code path using Streaming SIMD Extensions (SSE) will be faster than direct code.

E.g. a single 128-bit SSE instruction can replace four scalar addition instructions, and that is a very simplified example... there are more complex ones that give a bigger performance boost.

It is really great to use SSE instead of direct code.

The only drawback to using SSE is compatibility, but to be fair every CPU since 2011 has SSE up to 4.1 (even the VIA Nano supports SSE4.1).
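
What that looks like in practice (a textbook illustration, not code from NMS or Havok):

#include <xmmintrin.h>  // SSE intrinsics

void add4_scalar(const float a[4], const float b[4], float out[4]) {
    out[0] = a[0] + b[0];  // four separate scalar additions
    out[1] = a[1] + b[1];
    out[2] = a[2] + b[2];
    out[3] = a[3] + b[3];
}

void add4_sse(const float a[4], const float b[4], float out[4]) {
    // one 128-bit addps instruction adds all four lanes at once
    _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
}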
 
Any code path using Streaming SIMD Extensions (SSE) will be faster than direct code.

E.g. a single 128-bit SSE instruction can replace four scalar addition instructions, and that is a very simplified example... there are more complex ones that give a bigger performance boost.

It is really great to use SSE instead of direct code.

The only drawback to using SSE is compatibility, but to be fair every CPU since 2011 has SSE up to 4.1 (even the VIA Nano supports SSE4.1).

How much better is SSE 4.1 than SSE 3?
 

ethomaz

Banned
How much better is SSE 4.1 than SSE 3?
Each version of SSE has its own set of faster SIMD instructions... each one is unique and has a purpose. SSE4.1 is not better than SSE3... it just does different things.

SSE + 70 initial instructions
SSE2 + 144 new instructions
SSE3 + 13 new instructions
SSSE3 + 16 new instructions
SSE4.1 + 47 new instructions
SSE4.2 + 7 new instructions
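
An example of the "different things" (my own illustration): SSE4.1 added dpps, a one-instruction dot product that no SSE3-or-earlier CPU can execute.

#include <smmintrin.h>  // SSE4.1 intrinsics

__attribute__((target("sse4.1")))
float dot4(__m128 a, __m128 b) {
    // dpps with mask 0xF1: multiply all four lanes, sum the products,
    // place the result in lane 0, then extract it as a scalar.
    return _mm_cvtss_f32(_mm_dp_ps(a, b, 0xF1));
}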

Edit - Here's a graph of Sandra making use of SSE2 and SSE4.1:

[Image: Sandra11.png - Sandra benchmark results using SSE2 vs SSE4.1]

[Image: IntelISAExtTable.jpg - table of Intel ISA extensions]
 
Well, if Sean's tweet about a potential fix for PC users isn't a steaming lie, then the game will be patched soon.

We stubborn Phenom II users (myself included) can trudge on with these 5-year-old, high-TDP, increasingly bottlenecking CPUs until a game comes out that just straight-up kills them, and all this SSE talk will be forgotten. :)
 

ethomaz

Banned
Well, if Sean's tweet about a potential fix for PC users isn't a steaming lie, then the game will be patched soon.

We stubborn Phenom II users (myself included) can trudge on with these 5-year-old, high-TDP, increasingly bottlenecking CPUs until a game comes out that just straight-up kills them, and all this SSE talk will be forgotten. :)
I'm curious what he's talking about that will make PC and PS4 owners happy... PS4 has no issue with SSE because it supports all the versions... so maybe it is something else being fixed (multiplayer???).
 
We stubborn Phenom II users (myself included) can trudge on with these 5-year-old, high-TDP, increasingly bottlenecking CPUs until a game comes out that just straight-up kills them, and all this SSE talk will be forgotten. :)
Really you should consider them ten years old, tech-wise, seeing as how they don't support an instruction set that came out in 2007. :p
 
I already refunded my game; I have a Phenom II. If I rebuy the game on Steam to see if it works and it doesn't, can I still request a refund again?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Any code path using Streaming SIMD Extensions (SSE) will be faster than direct code.
That's not true. On CPUs with 64-bit-wide SSE implementations (i.e. most early desktop CPUs and not-so-early mobile CPUs), doing fp64 SIMD is slower than fp64 scalars most of the time.
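
A toy illustration (mine, with the usual microarchitecture caveats): on chips whose SSE units are 64 bits wide, the packed version below is cracked into the same two 64-bit additions as the scalar one, so any extra packing or shuffling makes SIMD a net loss.

#include <emmintrin.h>  // SSE2 (fp64 SIMD)

// Both functions perform exactly two double additions.
void add2_scalar(const double a[2], const double b[2], double out[2]) {
    out[0] = a[0] + b[0];
    out[1] = a[1] + b[1];
}

void add2_sse2(const double a[2], const double b[2], double out[2]) {
    // addpd: one 128-bit instruction, but split into two 64-bit micro-ops
    // on early CPUs, so there's no throughput win over the scalar version.
    _mm_storeu_pd(out, _mm_add_pd(_mm_loadu_pd(a), _mm_loadu_pd(b)));
}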
 
What CPU/GPU combo do you use for 60fps? Thanks!

AMD 1055T OCed to 3.5
GTX 1060 with latest nvidia drivers.
Fps is 60 on planets and while flying in space.
While entering planet atmosphere and landing, it drops to 40s.
Forced AF, quality texture and adaptive vsync from nvidia control panel. Game looks nice.
 

Jinkies

Member
This also happened with Earth Defense Force recently.

While it appears easy enough to fix, apparently support for CPUs without SSE 4.1 is being phased out.
 
This also happened with Earth Defense Force recently.

While it appears easy enough to fix, apparently support for CPUs without SSE 4.1 is being phased out.

It got fixed. I don't think they will phase out capable CPUs and lose potential buyers if the fix is easy enough and there is no performance or other benefit.
 
Makes me wonder how long before games require AVX. Anything older than Sandy Bridge or Bulldozer would be screwed.

Since they fixed the SSE 4.1 and SSSE3 requirements, and the game runs like a charm after that, I don't think they will force something else and alienate older CPUs. This was definitely an oversight by the devs.
 

dogen

Member
It got fixed. I don't think they will phase out capable CPUs and lose potential buyers if the fix is easy enough and there is no performance or other benefit.

In other games there could be a large performance benefit, though.

Honestly, you don't know there's no benefit in this game either just because your CPU can run it at 60 fps.
 

grmlin

Member
AMD 1055T OCed to 3.5
GTX 1060 with latest nvidia drivers.
Fps is 60 on planets and while flying in space.
While entering planet atmosphere and landing, it drops to 40s.
Forced AF, quality texture and adaptive vsync from nvidia control panel. Game looks nice.

Thanks! I think I'll pass with my 7950 and get the PS4 version instead.
 
But if 99% of games run at 1080p/60fps at settings that are way better than console settings, what's the point of buying a new CPU, motherboard and possibly RAM? Only to play at 100 fps?

Because 100+ fps is infinitely better than sub-60 or "almost 60". But yours is a fair point, actually.

No need to upgrade the CPU if you can still comfortably play newer games.

That doesn't mean, though, that you have to completely cheap out and buy old, weak hardware (an argument that can't apply to existing gaming rigs; you use what you have).
 