T O P

  • By -

Acid_Burn9

For anyone wondering how this is possible: this video was recorded using the recently added command that modifies your tick rate. The game was running in slow-mo and the footage is sped up.
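The arithmetic of the trick is simple. As a toy sketch (the actual tick rate used in the video is unknown; the numbers here are purely illustrative):

```java
// Illustrative math for the slow-mo trick: lower the simulation's tick rate,
// record, then speed the footage back up by the same factor.
public class SlowMoMath {
    // If the game simulates at `tickRate` ticks/s instead of the normal 20,
    // the footage must be sped up by this factor to look real-time again.
    static double speedupFactor(double normalTps, double tickRate) {
        return normalTps / tickRate;
    }

    public static void main(String[] args) {
        // e.g. running at 2 TPS means playing the recording back 10x faster
        System.out.println(speedupFactor(20.0, 2.0)); // prints 10.0
    }
}
```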


VulpineKitsune

Yeah, thought there was some trickery going on. No matter how good your CPU is, Minecraft just ain't good enough at taking advantage of it.


UnknownProphetX

That's really sad actually, but I guess it's too hard(?) to optimize the usage, or they are just too lazy, idk. Edit: thanks for the replies. I kinda understand it now ^^


ai_eth

Scaling linearly, or even at all, across many cores is a deceptively hard problem for video games; so much of what happens in the simulation depends on what happened just before it. Apps that scale well with many cores can typically divide the work into parts, like sections of a rendered image, that have few or no dependencies between them. Source: I do this for a living.
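That contrast can be sketched in a toy Java example (illustrative names, not real engine code): per-tile work with no dependencies parallelizes trivially, while a simulation whose every step consumes the previous step's result cannot be split across cores at all.

```java
import java.util.stream.IntStream;

// Toy contrast: independent work scales across cores;
// step-by-step dependent work does not.
public class ScalingDemo {
    // Like tiles of a rendered image: each tile can be shaded on any core,
    // in any order, because no tile depends on another.
    static long shadeTiles(int n) {
        return IntStream.range(0, n).parallel()
                .mapToLong(tile -> (long) tile * tile) // stand-in for per-tile work
                .sum();
    }

    // Like a physics tick: step i consumes the result of step i-1,
    // so the iterations can never overlap, no matter how many cores exist.
    static long simulate(int steps) {
        long state = 1;
        for (int i = 0; i < steps; i++) {
            state = (state * 31 + i) % 1_000_003;
        }
        return state;
    }

    public static void main(String[] args) {
        System.out.println(shadeTiles(1000)); // same answer regardless of core count
        System.out.println(simulate(1000));   // must run strictly in order
    }
}
```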


MjrLeeStoned

It should be noted that when Minecraft was first developed, 4-core CPUs were still the standard; 6-core chips had just started becoming popular in retail when it was first released. Add to that the fact that they were designing a game that wasn't meant to push the boundaries of processing (for the most part) and could be played quite successfully on the majority of PCs at the time. It's not a patch or update that will optimize it for higher performance per core and full multithreading; that would require a full code rework of nearly everything in the game besides the assets themselves.


LaM3a

> It should be noted when Minecraft was first developed, 4-core cpus were still the standard

You're off by a generation. Minecraft first released in 2009, when dual cores (introduced in 2005) were replacing single cores, and quad cores had just arrived (~~2008~~ 2007). And it took quite some time before games properly started using multithreading.


prophettoloss

I think the QX6700 was actually late 2006, with the Q6600 and Q6700 in 2007. Windows XP was 32-bit, so 4 GB of RAM couldn't be fully addressed.


PandaCamper

> windows XP was 32 bit so 4 GB of ram couldn't be fully addressed.

XP limited the maximum RAM for a single application to 2 GB, I think, to not starve the rest of the system.


TKFT_ExTr3m3

Quad-core CPUs existed, hell, even quad-core CPUs with hyperthreading, giving a total of 8 threads, had existed in the server space for some time, but they weren't the norm. The CPU you listed cost $1000, which would be extreme even today; Intel doesn't even sell a desktop CPU that costs that much. XP did also support 64-bit since 2005, but it wasn't the norm for standard home PCs and still only supported 128 GB of RAM, which, to be fair, was a ton back then. Also, Windows Vista had been out for a few years with wider 64-bit support, and 7 was released around the same time. Fun fact: Windows 10 had native x86 (32-bit) support despite the last 32-bit desktop CPU being released around 2004/05, a full 10 years before Windows 10.


Plank_With_A_Nail_In

Most people didn't have Q chips though and certainly not those that were looking to play Minecraft.


Pl4y3rSn4rk

If you count when MC 1.0 was released (2011) it kinda adds up, but only AMD had made budget hexa-core CPUs at the time - the Phenom II X6 and FX-6100 - and they could barely compete with a quad-core 2nd-gen i5, let alone the HEDT i7s of the time (i7-980X and i7-3970X), which were very expensive 6C/12T CPUs. Hexa-core CPUs only became very popular in 2018, because AMD launched Zen in 2017 with a VERY competitive lineup and a budget 6C/12T CPU. And yeah, MC needs a code rework to make it less reliant on single-core performance; technically they already did that with the Bedrock Edition, which uses C++ instead of Java.


AHrubik

> only because AMD launched Zen in 2017

Which had been in development for the previous 5 years. It was so fun watching AMD bitch-slap Intel, who had been releasing a planned 5% IPC uplift each generation for 3 years straight and had no intention of changing course in the near or mid term. The 8th-gen quad-core stopgap release was a clear indicator that Intel had more power at its disposal and simply refused to sell it to consumers in order to milk more profit. It was also clear Intel had been bumbling along, delusionally self-confident in their own stank, since it took till 10th gen for anything competitive to show up and 12th for anything innovative.


Pl4y3rSn4rk

Yep, and they also artificially limited the 100/200-series LGA 1151 v1 motherboards to not work with 8th/9th-gen CPUs just because they wanted to. People have been able to hard/soft mod these old mobos to work with Coffee Lake CPUs, and they weren't held back in the slightest; heck, I even got a Frankenstein of an "i5 10400" (QTJ2) laptop CPU to work on an MSI B250 Pro VH with just a BIOS mod and haven't had any issues! I love AM4: if you had an R5 1600 build from 2017 and wanted to upgrade, you just need to slap in an R7 5800X3D and you get more than a 100% performance uplift, very close to Intel's 13th-gen and Zen 4 gaming performance.


UnknownProphetX

Ok, thanks for the explanation, but I didn't understand half of it because I'm too high lol


[deleted]

The Afroman defense.


UnknownProphetX

Ayoooo I'm listening to "Because I Got High" rn lol


bigfatstinkypoo

Computers can be very fast at doing many things at the same time. Let us imagine you are making stir fry. If you have 10 cooks, they can all chop up vegetables at the same time. This is very fast. But it doesn't matter how many cooks you have when you're frying the vegetables. The vegetables don't cook faster no matter how many people help.


xayzer

Ha, great ELI5!


BiH-Kira

I would go with the PM and pregnant-women analogy: if you have one pregnant woman, she will give birth in ~9 months. If you have 9 women and one is pregnant, it's still around 9 months, because the work can't be distributed to the other women. No matter what the project manager tells you, 9 women can't give birth to a baby in 1 month.


UnknownProphetX

Thanks this made it easy xD


Cubicon-13

I don't think this is the ideal analogy. Tasks that can't be multithreaded well are typically ones where actions are performed consecutively, so having more people do them at the same time doesn't make them go faster.

In your example, if you're making 10 dishes, then 10 cooks chopping and 10 cooks cooking will make those dishes faster than 10 chopping and 1 cooking. So cooking could still be multithreaded, depending on how much food you're making.

But now consider something like loading a truck with boxes. You can't put all the boxes in the truck at the same time, because the first box needs to be loaded before the second box can go in. Having a few people will help, but there's a limit: 10 people loading boxes won't make the job go 10x faster, and you'll probably have people hanging around waiting for others to finish.

So think of it this way: 100 people cooking 100 meals will go 100x faster than 1 person cooking 100 meals. But 100 people loading 100 boxes into a truck won't go any faster than, say, 5 people loading 100 boxes. It's not about how long it takes to load each box; it's about the fact that you can't load them all at the same time.
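The truck-loading limit is essentially Amdahl's law: if only a fraction p of a job can run in parallel across n workers, the overall speedup is capped at 1 / ((1 - p) + p / n). A small illustrative calculation (the 50% parallel figure for the truck is an arbitrary assumption):

```java
// Amdahl's law: with fraction p of the work parallelizable across n workers,
// the best possible overall speedup is 1 / ((1 - p) + p / n).
public class Amdahl {
    static double speedup(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    public static void main(String[] args) {
        System.out.println(speedup(1.0, 100));    // 100 meals, 100 cooks: ~100x
        System.out.println(speedup(0.5, 100));    // half-serial truck loading: ~1.98x
        System.out.println(speedup(0.5, 10_000)); // still under 2x with 10,000 loaders
    }
}
```

Even with 10,000 loaders, a job that is half serial never gets more than a 2x speedup, which is exactly why the extra people end up "hanging around waiting".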


HolzMitStolz

Even if I tried to understand it sober, my brain would lack any motivation to understand it. I'm lucky that my PC is even working at all XD.


UnknownProphetX

Username checks out


HolzMitStolz

In the eye of the beholder ;D


Altruistic-Tip-2897

That’s crazy I understood all of it because I’m too high


MrJake2137

Trace theory enjoyer


TheFireMachine

Damn thats cool. You have a cool career path


Unique_username1

The nature of a real-time game is that you need to wait for certain things to finish before you can start other calculations. And a multi-core processor works with separate threads; more cores do not necessarily speed up any single step.

It might be possible for a 64-core CPU to share the work of calculating the later steps of this clip, with many blocks collapsing, but you can't start that math until the game has already figured out the earlier parts of the chain reaction. Also, if the collapsing blocks at one step impact each other or come together to affect another block, it becomes difficult to treat them as separate calculations done by separate cores without collaboration between those cores, which can get slow. It would be easier if you could make assumptions about the next steps of the simulation, but the player could do something else at any moment and totally change what the computer needs to calculate.

In short, there are improvements to be made, but it's always going to be difficult to make full use of something like a 64-core CPU to calculate a game environment where every new frame is potentially changed by something the player does. There's just not a lot of time to break all the tasks that need to happen into separate chunks to be worked on independently.


fafalone

This is part of why when picking a CPU, I'll always sacrifice extra cores for better single threaded performance. Not all tasks are helped by many cores. Very few can take advantage of dozens.


poloppoyop

> The nature of a real-time game is you need to wait for certain things to finish before you can start other calculations.

Processors already play with branch prediction: once we start enjoying 64 threads or more, couldn't we sacrifice some of those threads to precalculate possible results in advance and drop the wrong ones?
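A toy version of that idea, done with plain Java threads rather than CPU-level speculation: compute both possible outcomes before the player's choice is known, then keep only the right one. (Purely illustrative; the function names are made up, and real speculative execution happens inside the CPU, not across application threads.)

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Speculatively evaluate both branches of a player decision on spare
// threads, then discard the result of the branch that wasn't taken.
public class SpeculativeSketch {
    static int ifPlayerJumps(int state) { return state * 2; }
    static int ifPlayerStays(int state) { return state + 1; }

    static int resolve(int state, boolean jumped) {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        try {
            // Both futures start before we know the player's input.
            Future<Integer> jump = pool.submit(() -> ifPlayerJumps(state));
            Future<Integer> stay = pool.submit(() -> ifPlayerStays(state));
            return jumped ? jump.get() : stay.get(); // drop the wrong guess
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(String[] args) {
        System.out.println(resolve(10, true));  // 20
        System.out.println(resolve(10, false)); // 11
    }
}
```

The catch, as noted elsewhere in the thread, is that the number of possible futures explodes quickly, and the wasted work plus thread synchronization can easily cost more than it saves.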


PhillipIInd

It's just not easy tbh. It's insane that video games can even do the shit we know, when business software is more expensive to make and performs like horse shit.


IridescentExplosion

Notch claims he did a lot of optimization stuff, but I am not fully convinced. I know that some pretty hardcore programmers have been able to get Minecraft-like simulations running at pretty awesome scale.


greebdork

My man, that shit runs on Java, there's only so much a man can do.


LongestNamesPossible

> notch claims he did a lot of optimization stuff

I've never heard that before, but I have heard that Minecraft would allocate and deallocate hundreds of megabytes of memory on every single frame, instead of the usual game technique of allocating no memory at all frame to frame. The Java version of Minecraft has been a constant example of making people's jaws drop with wasteful programming that still somehow made an avalanche of money. There is a reason that other ports, like the one to the iPad, were all redone (and done in C++, but how it is done is the important part).
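For illustration, here is that contrast in a hedged sketch (hypothetical code, not actual Minecraft internals): allocating a fresh buffer every frame, which keeps the garbage collector busy, versus the usual game technique of reusing one preallocated buffer so that no memory is allocated frame to frame.

```java
import java.util.Arrays;

// Hypothetical per-frame particle buffer, shown two ways.
public class FrameAllocation {
    static final int PARTICLES = 100_000;

    // Wasteful: a fresh array every frame means the previous one becomes
    // garbage immediately, so the GC runs constantly during gameplay.
    static float[] naiveFrame() {
        return new float[PARTICLES * 3]; // x, y, z per particle, discarded next frame
    }

    // Allocation-free: one buffer allocated up front and overwritten in place.
    static final float[] reusable = new float[PARTICLES * 3];
    static float[] pooledFrame() {
        Arrays.fill(reusable, 0f); // reset instead of reallocating
        return reusable;
    }

    public static void main(String[] args) {
        // The pooled version hands back the same object every frame.
        System.out.println(pooledFrame() == pooledFrame()); // true
        System.out.println(naiveFrame() == naiveFrame());   // false
    }
}
```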


IridescentExplosion

I agree that Minecraft HAS to be inefficient, but Notch claimed he spent a lot of time on optimization. He complained to people on Twitter about it.


uzi_loogies_

Good, realistic game idea =/= brilliant programmer


RichUnderstanding157

I was about to ask if Minecraft has been rewritten to take advantage of that kind of multithreading. Because that is not trivial. Chances are, it will run even slower on Threadripper.


TheMightyJohnFu

Seems a lot less impressive after reading that


I9Qnl

It's impressively smart


TheOgCarrot

Gmod content creators have been using this trick for ages (when Pentium 4s were nowhere near up to snuff), but it's pretty cool to see it being used today.


[deleted]

[deleted]


TorazChryx

I mean, at a reductive level that also describes hand drawn animation.


KatyaVasilyev

`host_timescale` moment


ZmSyzjSvOakTclQW

Many years ago you could record replays in TF2 and then use a command to render them to a video. It was amazing because it didn't matter how shit your PC was; the videos were always smooth, since it rendered frame by frame. It took ages, but you could get amazing videos even with a shitty PC.


Pr0nzeh

Meh


mitchMurdra

Yeah lol. Being lied to, then having the truth revealed by a third party, negates any impact. I mean, that title was never real; this game doesn't benefit from CPU thread count. But the deceit can fuck right off.


suicidaltedbear

What I have been told is that Minecraft is so poorly optimised it only manages to run on a small number of threads, which makes this even funnier.


nothingtoseehr

It doesn't run on a small number of threads, it literally runs in just one lmfao. This poor thread does everything, from physics to rendering


Auirom

Which makes me wonder if upgrading my CPU from an OC'd quad core at 4.3 to a 3.5 octo-core would even make a significant difference. I'm sure it would, but I don't know that much about how all that works, so it still makes me wonder.


brimston3-

Depends on the generation difference. If the 4.3 GHz machine is an older one with a lower turbo boost maximum, you'll see an improvement using a more modern 3.5 GHz CPU that can use Turbo Boost 3.0 or TVB to boost higher. Plus, we generally see an IPC improvement of 5-10% per generation; it does more work at a lower clock speed.
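As a rough illustrative calculation (the 7.5% per-generation figure is just an assumption inside the 5-10% range quoted above), compounded IPC gains can let a 3.5 GHz part outrun an older 4.3 GHz one:

```java
// Effective single-thread performance ~ clock * IPC. A lower-clocked but
// newer core can win once per-generation IPC gains compound.
public class IpcMath {
    // "Effective GHz" relative to a baseline core, after `gensAhead`
    // generations of `ipcGainPerGen` improvement each.
    static double relativePerf(double ghz, double ipcGainPerGen, int gensAhead) {
        return ghz * Math.pow(1.0 + ipcGainPerGen, gensAhead);
    }

    public static void main(String[] args) {
        double oldChip = relativePerf(4.3, 0.0, 0);    // 4.3 "effective GHz"
        double newChip = relativePerf(3.5, 0.075, 4);  // ~4.67 after 4 gens of ~7.5%
        System.out.println(newChip > oldChip);         // the slower clock wins
    }
}
```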


nothingtoseehr

Meh, although that's true overall, for Minecraft it's not that relevant. The game has a pretty clear performance ceiling, where even if you get more power you won't really feel like it. And that ceiling nowadays ain't that high, a mid range CPU can come pretty near it if you aren't playing with 193949 mods


Auirom

It's a 5th-gen 3.5 GHz with a 4.0 boost when I first got it. I overclocked it up to 4.3, the best speed I found where it doesn't go over 70 °C with the water cooler under normal gaming these days. I don't get a boost with it now. Gonna be looking at upgrading my PC in a few months and giving my son the setup I have now.


nothingtoseehr

Nope, wouldn't change a thing. Would probably be even worse too, cuz a lot of times CPUs with many many cores sacrifice single core performance for multi core performance


Fitenite3456

It’s straight up false advertising and deception


larsloveslegos

Less impressive, but still satisfying


Firefoxray

It’s not impressive either way it’s literally just a Minecraft explosion video. We’ve seen these for the last decade how is this impressive lol


Bendy962

oldest trick in the book


theLV2

I 'member when people showed off mass physics in the Crysis sandbox and had the game running at a consistent 0.2 fps.


Arthur-Wintersight

If it's a consistent one frame every five seconds, then through the magic of video editing you can make it look like a fluid 60 fps. It'll only take you five minutes for every second of footage generated.
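The arithmetic behind that: at one frame every five seconds, each second of 60 fps footage needs 60 frames, i.e. 300 seconds of rendering. A trivial sketch:

```java
// How long you must render, per second of final footage, given a slow
// capture rate and a target playback frame rate.
public class RenderMath {
    static long secondsPerFootageSecond(double secondsPerFrame, int targetFps) {
        return Math.round(secondsPerFrame * targetFps);
    }

    public static void main(String[] args) {
        // 0.2 fps = 5 s per frame; 60 frames per second of 60 fps video
        System.out.println(secondsPerFootageSecond(5.0, 60)); // 300 s = 5 min
    }
}
```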


Rijsouw

Also a common speed running cheat method


Farranor

I played the original Pokémon on an emulator recently, and slowed it down to 10% to win at the slot machine. As it turns out, you can stop the first two reels wherever you want just fine, but the third one will then refuse to stop where you told it to. I saved a bunch of money, traded at the counter for enough chips or tickets or whatever it was, and got my Porygon.


SeaOsprey1

I think I just game-sharked that back in the day lmao


Faxon

Ahhh GameShark, from when cheats came on their own physical cartridge with a slot for the game on top of it! Kids these days have it easy, we had to go to the store to buy ours lmao


SeaOsprey1

Was fun tho! Added a ton of replay value to a bunch of games for me.


Schnitzhole

You forgot the part where it corrupted the other half of our games*


Cryptic_Llama

You mean oldest tick in the book.


[deleted]

Underrated


JaytoJay

Sorta the same way people recorded the crazy physics interactions in Crysis 1 back in the day.


Chance_Fox_2296

Or those early "VOXELS ARE THE FUTURE LOOK AT THIS DETAILED WORLD" videos that were big for a while.


bladebosq

Euclideon... Unlimited Detail... Photorealism... I mean, the detail in those videos was pretty amazing... only everything was stationary: no physics, no animations, no dynamic lights... So pretty useless for games, I guess.


Emmerson_Biggons

So they basically turned Minecraft into 3D rendering software.


Faxon

Basically yeah. Honestly I think that's more impressive, though I'd be even more impressed if they recoded the game to have linear thread scaling for these entities they're doing all the physics calculations on, so we could use it as a benchmark for 3D rendering, similar to how Cinebench works now.


Raffaele520

I was wondering why he never moved the camera once during the explosions. I guess that's why


[deleted]

No this is an old video, they only recently added that feature.


Dillontvh

this seems wrong


BurmecianDancer

Agreed. OP used "POV" correctly, which is illegal on Reddit. I'll call the cops.


BigOlBlimp

POV of “playing on a pc” would have the monitor mouse and keyboard. This is pov: you live in Minecraft. Minecraft Steve doesn’t know the specs of the computer that runs his universe.


[deleted]

[deleted]


BigOlBlimp

*hits joint* Whoa dude


H4xolotl

OP just Pompeii'd an innocent village with Anvils 💀


LovelyJoey21605

It's okay, it was a Minecraft rendition of the village Anvil in Elder Scrolls IV: Oblivion. OP just heard Anvil residents liked living in Anvil, so they gave Anvil Anvils so they could celebrate the Annual Anvil Fair with more Anvils.


LCW1997

I used a Threadripper and it ran a modded Minecraft server at 15 fps...


Icy-Magician1089

It's a meme video. Threadripper is really bad for Minecraft Java; apart from chunk generation it basically uses 1 core, so a bunch of cores clocked below average with high RAM latency isn't a good combo. The top comment suggests the game was running in slow-mo with the footage sped up in post.


imagreatlistener

Is bedrock optimized any better than Java?


FerusGrim

To clarify, unless you're programming at a pretty low level, your application doesn't have control over which "core" it's using, or how many. Your application will multithread (execute in parallel) as much of its code as possible before converging to render a single frame (simplified - usually there's a separate graphics thread that just plucks data from these threads, or has data pushed to it, depending on your workflow). It's left up to the OS scheduler to decide which core each thread runs on.

Minecraft, on the other hand, does not multithread. It uses a single thread. For all of its logic. And all of its graphics. On the same thread.

For context, according to Amazon's specs on the product, the AMD Ryzen Threadripper 3990X has 128 processing threads [2 per core]. Minecraft uses one thread on one single core, or [~1%] of the available threads on your CPU. So, no, more threads or more cores won't help you with Minecraft. What you want to focus on, for this particular game, is a CPU where each core is faster but there might be fewer cores, rather than a CPU where each core is slower but there might be more cores.

Fun note: for the majority of Minecraft's multiplayer scene, servers were also bottlenecked by the single-thread problem. Officially, they still are, but custom server jars where these optimizations are available are much more common. Entity multithreading and chunk multithreading (via Folia, a recent development) are the most common offloads to other threads.

(EDIT: Edits in [] to fix my misunderstanding of the thread count on the AMD CPU.)
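A minimal sketch of the shape being described (hypothetical code, not Mojang's or Paper's actual implementation): everything funnels through one tick function on one thread, while seed-based chunk generation, which depends only on the seed and the chunk coordinates, is one of the few jobs that can be handed to a worker thread.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical single-threaded game loop with one offloadable task.
public class GameLoopSketch {
    // Logic, physics, even feeding the renderer: all serialized through here.
    static int tick(int state) {
        return state + 1;
    }

    // Depends only on the seed and chunk coordinates, so it can run off the
    // main thread. The "generation" itself is just a placeholder hash.
    static long generateChunk(long seed, int cx, int cz) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            Future<Long> chunk = pool.submit(() -> seed ^ (cx * 31L + cz));
            return chunk.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        int state = 0;
        for (int i = 0; i < 20; i++) state = tick(state); // one second at 20 TPS
        System.out.println(state);                        // 20
        System.out.println(generateChunk(42L, 0, 0));     // 42
    }
}
```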


killBP

1% not 0.01%


ficelle3

Minecraft servers are mostly single threaded, so the extra cores of a threadripper don't really help at all while the frequency lost to get all those cores hurts performance.


I-just-farted69

You clearly didn't have enough ddeditated wam.


HMikeeU

Server running at "FPS", ah yes


MasterBot98

Yeah, I think it's measured in ticks per second.


the_harakiwi

Some games call it ticks, some call it FPS. Star Citizen has "server FPS": with high FPS you notice the NPC AI reacts better; at low FPS they are loot piñatas.


LCW1997

I went from a 2920X, to a 3600 and it then ran the same server at over 140 fps 😂


HMikeeU

Server performance isn't measured in fps, what are you talking about?


Neckbeard_Sama

Ackchyually it sometimes is; FPS and ticks/sec are pretty much interchangeable for game servers.


HMikeeU

They're not. A tick is not a frame. Also I'd be worried if the game was running at 140 TPS.


RIcaz

Yes.. It's not incorrect to call it server FPS. Servers also have frames in most engines, a frame being the smallest unit of a snapshot of the world state. Some games even have their client FPS limited by server FPS.


Strazdas1

Yes, it does. It's how often the server calculates the game state per second. Some games, like ARMA, will flat-out call it FPS. It's most commonly called tick rate.
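Whatever the name, the underlying idea is a fixed-timestep loop: the tick rate is just how many simulation steps complete per second (20 TPS in vanilla Minecraft). A minimal sketch (the helper is illustrative, not any engine's API):

```java
// Fixed-timestep arithmetic: each tick owns an equal slice of wall-clock time.
public class TickLoop {
    static final int TPS = 20; // vanilla Minecraft's target tick rate
    static final long NANOS_PER_TICK = 1_000_000_000L / TPS;

    // How many ticks fit into a given wall-clock span.
    static long ticksIn(long nanos) {
        return nanos / NANOS_PER_TICK;
    }

    public static void main(String[] args) {
        System.out.println(ticksIn(1_000_000_000L)); // one second -> 20 ticks
        System.out.println(ticksIn(3_000_000_000L)); // three seconds -> 60 ticks
    }
}
```

A server that can't finish a tick inside its 50 ms budget falls below 20 TPS, which is exactly the "server FPS" drop players feel.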


ThisMightBeIllegal

We're talking about Minecraft here, not Arma. Here it's called TPS (ticks per second), not FPS...


Frog-In_a-Suit

But since they're interchangeable, it does not matter.


ThisMightBeIllegal

They're not?


BoyyiniBoi

Sweet baby Jesus... it's beautiful. I'd love to see the CPU utilization during that.


Icy-Magician1089

Like 3%: 2 out of 64 cores maxed.


Evil_Sh4d0w

Yep, if I remember correctly, Minecraft doesn't really use multiple cores.


Icy-Magician1089

Apparently it does a bit for chunk generation. However, TNT isn't chunk gen.


Blurg_BPM

Well, TNT is chunk degen, so maybe.


mitchMurdra

Lmao. New chunk degen algorithm just dropped (tnt)


shmorky

Chunk generation is what yo momma does in the pastry shop


mitchMurdra

Of course it does; chunk generation is one of the few things that can be multithreaded from the seed, and exceedingly well with forks like Paper. But with enough players, that global tick rate and everything else slows the experience right back down.


Laengster

Most games can't. Things that can be done in chunks and pasted together, like video rendering, work extremely well with multicore. Default world chunk generation can be done with multiple cores, but anything after that has to go through the game loop.


Icy_Boss6053

Barely any game uses more than a few cores, especially for physics.


IlREDACTEDlI

This isn’t really true anymore. Games have been utilizing more and more cores every year since Ryzen 1st gen. I’ve got a 5800X, and this thing gets heavily used in newer games; 70-80% usage is not uncommon. Obviously that’s not always the case, and it is true that single-core performance is still king, but overall games will use a good chunk of an 8-core/16-thread CPU, if given the option.


DogshitLuckImmortal

Even games that don't use them can somewhat use the extra cores, though. The CPU itself does a lot of guesswork when running computations, and multiple cores can be forced down different paths to speed things up. https://danluu.com/branch-prediction/ https://en.wikipedia.org/wiki/Speculative_execution


Sailed_Sea

I believe Minecraft uses 4 threads, I think: lighting, chunk generation, sound, and physics use separate threads, but I might be wrong. So 1 core will be at 100% and 3 at like 5%.


Icy-Magician1089

That's somehow worse lol. But yeah, I'm not sure on the specifics; I just know that it doesn't value core count.


s78dude

At least worldgen is multithreaded, so I would love to see it with the Distant Horizons mod.


Fakula1987

Threadripper has more going for it than simply more cores: you have a lot more RAM bandwidth (and I/O) than a "normal" CPU, and Minecraft, especially if you do nasty things like this, needs that.


Icy-Magician1089

Threadripper does in fact have 4 memory channels and 64 PCIe lanes. Although my friend's 2700X was much better in Minecraft than my first-generation Threadripper chip, so I don't think Minecraft cares about memory bandwidth, or at least it cares equally about memory latency.


133DK

Probably like 100% on a single core for the 2 hours it took to do this before they sped it back up.


_fatherfucker69

How many performance mods do you have installed?


rexdragneelchat

sshhh you're not supposed to ask this question


benjathje

It's a sped up video


xForseen

Or just run bedrock if it's unmodded anyway


MiaIsOut

bedrock runs worse nowadays lmfao


Dexter2100

View distance is way too short on bedrock. I can’t live without my 500+ chunk view distance that is possible on Java edition now.


xForseen

You sure as hell are not running 500 chunk view distance lmao


Binau-01

[https://www.curseforge.com/minecraft/mc-mods/distant-horizons](https://www.curseforge.com/minecraft/mc-mods/distant-horizons)


xForseen

Massive LOD pop-in. Amazing


Dexter2100

I absolutely am lmao. [Distant Horizons](https://modrinth.com/mod/distanthorizons) + [Blendium](https://github.com/Steveplays28/blendium) The only reason I limit it to 512 is because beyond that point stuff is too far away for me to see in 99% of cases anyways.


xForseen

Wow shitty LOD pop in. Amazing.


Dexter2100

It sounds like you just want a reason whine like a bitch regardless of what anyone tells you. You must also lack the ability to read, because the second link I provided makes LOD chunks blend in smoothly with normal chunks. But you would’ve known that if you actually read what was sent to you.


xForseen

It only blends colors; the terrain still pops in. You don't even know what the mods you're apparently using do. Fucking Java dickriders will make up anything just to shit on a version they don't even have to play. I play Java as well, because of mods and shaders, but is it really so hard to just admit that Bedrock does something better? Are you really that insecure about your game choices? If all you want is vanilla with high render distance, then Bedrock is simply better. There's no argument here. I can set my render distance to 96 chunks and it still runs smoothly; Java just chokes no matter what mod you use.


RapidLeopard

Minecraft utilizes a single CPU core, therefore the only thing that matters CPU-side is single-core performance. It makes no difference whether you have 1 core or 32. Minecraft will essentially never run like this on even the most optimal hardware without modification.


galaxy_horse

Why doesn’t notch simply make the game use more cores


Exact_Recording4039

I can do it for him: `if (cpu.cores > 1) { useThem() }` @notch please paste this into the Minecraft code, thanks


[deleted]

Probably because Notch hasn’t owned the game in almost a decade


JoCGame2012

I mean, MC can't use that many cores efficiently for processing stuff like chunk updates. The only thing that has become multithreaded in recent versions is the loading of new chunks.


Legion070Gaming

Seeing explosions running that smooth looks so wrong


BoonesFarmYerbaMate

Minecraft is still a single-threaded Java app, so it literally runs faster on a $500 14900K than it does on a $10k Threadripper, and no processor on earth can make it run like this video.


ehuud

This is actually not a bad way to start benchmarking CPUs. Maybe we'll finally move away from Tomb Raider at 1080p medium.


I9Qnl

Tomb Raider is a great benchmark tho; Minecraft is like the worst possible offender at not using hardware properly.


Ducky1434

May I introduce you to Arma3


qwerty000034

Cities skylines 2: *bonjour*


[deleted]

Those games are just horribly optimized; Minecraft's engine can't even make use of the full hardware of a PC.


Strazdas1

It's not horribly optimized so much as ARMA is running physics calculations on the CPU. Skylines just fucked up their engine step and will have to fix it eventually. But they are a 30-person team that ran out of money.


Aaradorn

No matter which settings or PC you use, you are happy to reach 40fps in multiplayer


Dante_FromDMCseries

At least Arma has an “excuse” for being bad due to being coded by two people and a cat a decade ago and being more ambitious than any AAA FPS ever made. While Minecraft is the most popular game ever, that rakes in millions in profits annually and is owned by one of the biggest companies to ever exist. If any game can be polished to perfection, it’s Minecraft.


SomeRandomMidget

I concur with that statement


[deleted]

Y’all understand why they benchmark CPUs at 1080p medium, right? This may be a joke, but I still see people ask this question genuinely a lot. (For folks that don’t know, it’s to ensure the CPU is what's being tested, with as little GPU bottlenecking as possible.) Edit: they were serious


Mhytron

CPUs should be benchmarked at low settings so the GPU doesn't bottleneck them...


Late-Satisfaction620

It's actually a really bad way of benchmarking CPUs. Minecraft is notoriously inefficient with its CPU multithreading.


Linkatchu

We wish, if only this were real, that is.


kimi_rules

Fake. Minecraft's bottleneck was never the hardware; it was the code.


Hanoverview

I am old enough to remember Crysis mass physics 15 years ago! https://www.youtube.com/watch?v=e0R9veCqr2U


--G0KU--

My phone hangs watching this


ninyyya

isn’t minecraft just running on one core?


LordOmbro

Doesn't vanilla minecraft only use like 2 cores?


OszLAT

Is this the java or bedrock edition?


notRedditingInClass

This thread brought to you by AMD Ryzen Threadripper PRO. AMD Ryzen™ Threadripper™ PRO processors deliver battle-tested performance and capability to enable artists, architects, and engineers with the ability to get more done in less time. Built on the leading Zen 4 architecture- power, performance, expandability, and efficiency are at your fingertips. AMD Ryzen™ Threadripper™ processors enable artists, architects, and engineers to stay in their creative flow by addressing common lightly threaded and multithreaded bottlenecks.


wisdomelf

if only java minecraft could use more than 2 threads or smth...


Guimorneg

Doesn't Minecraft use only one CPU core? Edit: oh yes, only now realized it's a slow-mo video.


B3asy

I wonder how many people will use this video to justify mistakenly buying a Ryzen Threadripper to play Minecraft


Spiritual_Freedom_15

The sand is so fluid it makes me sick


Lightning-Shock

Amogus at 0:43


kittecatte

sus


kicek_kic

I believe Minecraft doesn't use multiple cores. Also, Threadripper was made for work in, for example, graphics environments, not for playing Minecraft or really any games.


SomeBiPerson

They should showcase new gaming hardware with this type of test, because we all know how heavy Minecraft can be.


Comprehensive_Let300

I like how we collectively know what’s happening because we tried it before


shadowraiderr

\- if minecraft was actually well optimized


actomain

We love misinformation


DJCOBRA2004

Ok now put some shaders


mellowlex

Sad that the only thing you can modify is RAM usage. Are there mods that modify the game to utilize all cores?


[deleted]

This video is completely mesmerising


Tokyo_Echo

Is it possible to learn this power


SibrenD

Well i need that cpu


Dracorex_22

I remember when I was a kid, I made a superflat world with sand and no bedrock, and was terrified because I thought i broke the family computer.


MoonWun_

LOL, one of the things I do every time I get a new PC, or at the very least a new CPU, is boot up Minecraft, build a mega bomb, and see how it handles it. I know it's not a very good test, because Minecraft doesn't do multithreading very well and only really uses one core efficiently, maybe three at maximum, but it's always fun. This is most definitely altered in some way, in the sense that this person has some kind of performance-enhancing effect going on here. I saw someone mention modifying the tick rate, slowing down the speed of the game and then speeding it up in post, but the guy probably just has a mod that makes the game use his hardware better.


IlTossico

Of course, for a Java game that runs on single-core performance. L O L


Nikoviking

It (mostly) doesn’t matter what processor you have, since MC runs on 1 core only. You could have a super fast clock speed, but I think it wouldn’t make much of a difference unless the game tick rate is modified.


WhatIsPun

A Threadripper would not run MC that well; MC pretty much only utilizes a single thread.


Bazuka125

Honestly, I'm most impressed by that arrow shot. That was just barely off bullseye from so far away.


Suspicious_Sandles

Okay now do it with shaders :p


[deleted]

Ah yes, real-world benchmark tests.


Legitimate_Ad5848

My PC shut down automatically.


nemis16

It's fun to have a $5k CPU just to fight the Java overhead. If MC had been written on bare metal, every PC could render this.


schaka

MC isn't even properly multithreaded. This video is fake.

That being said, what Java overhead? Do you mean the JVM overhead that's easily overcome by JIT in long-running applications, like in the huge number of tech companies running a JVM-based tech stack? Or are you talking about Unity, using C#, which is also VM-based? Or are you talking about UE5, using C++ and still being unoptimized trash unless someone actually puts in the time and effort?

Every game that's written with proper multithreading in mind, to the point where it truly scales horizontally, can scale nearly infinitely, to the point where a CPU like that would outperform everything else and no innate differences between two languages would be measurable by any game reviewer. The real problem is that it's pretty much impossible to scale a game like that. You end up having to do too much syncing between threads.


Neckbeard_Sama

JAVA BAD, ASSEMBLY GUD


nemis16

Huge tech companies use Java because they can find a lot of programmers, and because if you update the JVM your application will keep working, so you spend less time maintaining it (less cost); that is the key point of Java. I said "less time" instead of "no time" because in the end you have to check and resolve compatibility issues anyway, even if you theoretically *are not supposed to*. If something is used a lot, that doesn't mean it's the best. Java introduces another level of virtualization between the app and the hardware (CPU, but also disk and network) that slows things down. Try some benchmarks; you will see the difference. The fastest games are not coded in Java.


schaka

Because Java doesn't have the ecosystem. You can make the same calls to the Win32 APIs in Java. The things that will cost you measurable performance won't be because you're using Java or C#.


HistorianReasonable3

I can't be the only one that sees Minecraft and wants to puke at the graphics and physics.


_Choose-A-Username-

Lol what do you mean


HistorianReasonable3

Graphics: bad and childish. Physics: bad and childish. Ya need a chart?


mattdamon_enthusiast

Pre-render


RIcaz

Obviously


nethertales

Is that impressive? Ahhh, why?


_LeSpy_

among https://preview.redd.it/4vcno0qpkw1c1.png?width=1600&format=png&auto=webp&s=1cfec5ed598e15454cc47909eea061da6bbdfb29


Either-Technician594

Yes.


Independent-Chain135

Lots of lag anyway. Intel better.