In fact, the overclocking process has changed a lot over the past 10 years, and all these horror stories date back to the shaggy old days, when overclocking some components meant climbing into the computer almost with a soldering iron in hand, and success depended equally on the overclocker's skill and on plain luck.

Overclocking is as complex as origami, symbolist poetry and quantum physics


Perhaps the most ancient and thoroughly outdated myth. Technically, humanity has taken a big step forward: we are switching to renewable energy, getting HIV under control and designing neural networks that write songs and prose. It would be strange if, to overclock a computer, we still had to do things the old-fashioned way, rearranging jumpers on the motherboard or manually dialing in processor settings in an antediluvian, command-line-like BIOS. So it is no surprise that the modern overclocking process has become far more understandable and friendly; even a child can handle it. The whole procedure boils down to launching a dedicated utility for overclocking a video card (Afterburner, GPU Tweak, etc.) or a processor (CPU Tweaker, Intel XTU, Ryzen Master, etc.), after which a simple settings menu opens where you enter the values you need, and that's it. Most of these utilities are reliably foolproof: you physically will not be allowed to set some bizarre parameters that would burn everything to the ground. And if one-button overclocking is not your style, modern BIOSes with friendly graphical interfaces will walk you through the whole process by the hand and warn you if the values you set are dangerous for the system.

Overclocking RAM does nothing


Hearing this phrase 10 years ago, most of our editorial staff would have nodded in agreement. Indeed, playing with timings and voltages produced pretty numbers in benchmarks but had little effect in practice. Today, the speed of data exchange between the processor and memory significantly affects system performance in many modern tasks. For example, the first generations of Ryzen suffered from a weak memory controller, so in their case the operating frequency of the RAM directly affects CPU performance. With Intel the impact of RAM on performance is less pronounced, but it exists too. Our tests of the effect of RAM frequency on system performance showed that in games, archiving and encoding, baseline memory running at 2666 MHz is 5-10% slower than the same memory at 3200 MHz, and the gap widens further against modules at 3600 and 4000 MHz. That is a significant performance boost. Or here is a good example (https://www.overclock.net/media/no-title.4495631/full) of the impact of memory speed on FPS in The Witcher 3.
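The raw numbers behind that gap are easy to sketch. Here is a hedged back-of-the-envelope calculation of theoretical peak bandwidth (assuming a standard 64-bit DDR4 channel and dual-channel operation; real-world gains are smaller than the theoretical difference, as our 5-10% test results show):

```python
# Theoretical peak bandwidth of DDR4 memory:
# effective transfer rate (MT/s) * 64-bit bus width / 8 bits per byte, per channel.
def ddr4_bandwidth_gbps(mt_per_s: int, channels: int = 2) -> float:
    """Peak bandwidth in GB/s for the given effective rate and channel count."""
    return mt_per_s * 64 / 8 * channels / 1000

for rate in (2666, 3200, 3600, 4000):
    bw = ddr4_bandwidth_gbps(rate)
    gain = bw / ddr4_bandwidth_gbps(2666) - 1
    print(f"DDR4-{rate}: {bw:.1f} GB/s ({gain:+.0%} vs DDR4-2666)")
```

On paper DDR4-3200 offers about 20% more bandwidth than DDR4-2666; that the measured gain in games and archiving lands at 5-10% just reflects that real workloads are not purely bandwidth-bound.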

RAM heatsinks are a must-have; without them everything will burn and Cthulhu will wake up


Have you seen the show “Adam Ruins Everything”, in which the host tramples on familiar myths, proving that lavish weddings are imposed on us by the wedding industry and 300-guest wakes by funeral homes? If this text were a script for that show, Adam would take on this myth with particular pleasure. In reality, the idea is actively pushed by marketers trying to sell us the same memory modules at a higher price. As a result, you can find models on sale with ordinary foil in place of a heatsink. And you know what? They work great. A heatsink is needed only for really serious overclocking, when every last drop is squeezed out of the module and the memory chips run under maximum load.

Overclocking is risky: you can burn the motherboard or kill the processor


To overclock the ancient Pentium 4 Northwood, you simply raised the operating voltage in the BIOS to 1.7 V, then tossed a coin and hoped for the best. Many chips of the series quickly failed in this mode, and the phenomenon became widely known as Sudden Northwood Death Syndrome. But that was in ancient times, at the dawn of overclocking, when you could hit the jackpot and unlock disabled processor cores, or burn everything to hell. Today's overclocking is, as they say, gentle, legal and stress-free. At best you'll squeeze an extra 10% out of your CPU cores, and overclocking utilities will walk you through the process step by step.

The motherboard does not affect overclocking


Yes, the motherboard itself does not get overclocked; strictly speaking, there is nothing in it to overclock. Then why do people buy expensive motherboards on flagship Z490 or X570 chipsets for gaming computers, when you could drop the simplest board into the case and spend the difference on a video card and a Cyberpunk 2077 collector's edition? The thing is, current processors have become real monsters: 4 cores in a home computer no longer surprise anyone, and massive 16-core chips like the Ryzen 9 5950X are already here. As a result, under overclocking these chips can draw 200-250 watts. Hence the increased requirements for the motherboard's power subsystem, which must handle that power draw properly while keeping the power circuitry cool. On a board with a weak power subsystem, even a stock Core i7-10700 will not just fail to overclock, it will fail to hold its advertised turbo-boost frequencies. In other words, it is the motherboard that plays the role of the bottleneck during overclocking. And that is not even counting the fact that many boards based on entry-level chipsets simply do not support CPU and RAM overclocking at all.
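To see why the power subsystem matters, here is a rough, hedged estimate of the current each VRM phase has to carry (the 1.3 V core voltage and the phase counts are illustrative assumptions, not measured values for any particular board):

```python
# Rough per-phase current on a motherboard VRM.
# At a fixed core voltage, current scales directly with CPU power draw,
# and each VRM phase carries its share of that current.
def per_phase_current(cpu_watts: float, vcore: float, phases: int) -> float:
    total_amps = cpu_watts / vcore  # I = P / V
    return total_amps / phases      # assume an even split across phases

budget = 250  # watts, an overclocked high-core-count CPU from the text above
for phases in (4, 8, 12):
    amps = per_phase_current(budget, vcore=1.3, phases=phases)
    print(f"{phases}-phase VRM: ~{amps:.0f} A per phase")
```

A 250 W load at 1.3 V is roughly 190 A in total: a budget 4-phase design pushes each phase toward 50 A and a lot of heat, while a beefy 12-phase board keeps it around 16 A per phase. That difference is exactly the bottleneck described above.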

Components fail faster when overclocked


No one has conducted large-scale studies on this topic, and anecdotal experience is a poor indicator. By the laws of physics the idea sounds plausible: if a mechanism or a piece of electronics constantly works under maximum load, it wears out faster. That applies to RAM chips, processor dies and GPU dies alike. The problem with the thesis is that we cannot adequately measure how much running at the limit accelerates the wear of a particular component, or even where exactly that limit lies. Given how few detailed accounts exist online (and how few confirming statistics come from service centers), we can conclude the problem is overblown.

By buying a low-end model and overclocking it, you will reach the level of an older model


Not quite. In its early days, overclocking resembled the Wild West, where adventurers flocked in search of quick, easy money. Back then processors could be overclocked by a quarter of their stock speed without raising the voltage, and if you raised the voltage and won luck's favor, a chip could speed up almost one and a half times. At this point, press F to salute the old Celeron 300A and Core 2 Duo E7200. Or recall the Athlon XP on the Thorton core, which could double its L2 cache when unlocked, and the Phenom II X2 5xx, which with some luck could unlock dormant cores and go from dual-core to quad-core. This happened mainly because the manufacturer, trying to maximize profit, sold binned higher-tier processors as entry- and mid-level models with parts disabled. So in effect you were buying a lottery ticket that, with a bit of luck, could turn a pumpkin into a carriage. Such acts of generosity are almost never seen today, so you will not turn a GTX 1070 into a GTX 1080, no matter how cool a cooler you bolt on or how much extra power you feed it.

Overclocking leads to instability


A computer can crash, glitch and run unstably without any overclocking; that is nothing new. The point is that the silicon of a CPU or a video card's memory can hardly be called 100% stable: at any combination of frequencies and voltages some errors occur, and in most cases the hardware corrects them itself. When it can't, you are greeted by a blue screen even at stock settings. Overclocking, then, amplifies this existing instability rather than introducing anything new. That is exactly why there is a legion of stress tests like AIDA64 and Prime95, which help you quickly find, and verify, the optimal frequency and voltage values at which the system will not fall over.
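The trial-and-error loop those stress tests support can be sketched as a simple search: step the clock up, run the test, and back off at the first failure. Everything below is hypothetical scaffolding; `run_stress_test` stands in for an AIDA64 or Prime95 session and does not touch real hardware.

```python
# A sketch of how overclockers converge on a stable frequency:
# keep stepping the clock up while the stress test still passes.
def find_max_stable(base_mhz: int, step: int, run_stress_test) -> int:
    """Return the highest frequency (in MHz) that passes the stress test."""
    freq = base_mhz
    while run_stress_test(freq + step):  # would the next step still be stable?
        freq += step
    return freq

# Stand-in for a real stability run: pretend everything under 4300 MHz passes.
fake_test = lambda mhz: mhz < 4300
print(find_max_stable(3600, 100, fake_test))  # -> 4200
```

In real life each "test" is an hours-long stress run rather than a function call, which is why overclockers usually take larger steps first and fine-tune voltage separately at the end.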

For overclocking, you need to change the entire cooling system


This is both true and not. Just as a medicine becomes a poison when you raise the dose, overclocking can affect a system in very different ways. In practice, an average video card with a decent cooling system lets you squeeze an extra 4-6% out of the core; in some cases, as with the Zotac GeForce GTX 1050 Ti and a bunch of other mid-range cards, the gain can reach 8-10%. Everything depends on the components themselves. For example, the latest generations of Ryzen processors are already pushed close to their limit at the factory, so the goal is a stable turbo boost across most cores rather than squeezing out another 300 MHz. Older Ryzens, on the contrary, overclocked quite well even with the stock Wraith Spire cooler. In general, for moderate overclocking, high-quality stock cooling is enough in most cases; the only question is how much noise it will make. But if you want extreme overclocking, welcome to the world of elite tower coolers, water cooling and Thermal Grizzly thermal paste. That, however, is another story.

As an afterword


As you can see, overclocking then and now are two different things. Today it is simpler and safer, but the potential winnings are smaller, and whether that is good or bad everyone decides for themselves. From an ordinary user's point of view, getting an extra 5-15% of performance with a few clicks in Afterburner is a nice bonus.