Overclocking is primarily a game of heat management. Instability tends to show itself early rather than creep up on you over the long term: if the system runs stable under load for a week, it should be fine. A small voltage bump can fix some stability issues, at the cost of extra heat.
A few things about overclocking:
Non-binned CPUs are normally the better pick. If you can identify one, it tends to live a longer, more stable life.
Intel CPUs really need the K suffix to be truly overclockable, since that's what marks an unlocked multiplier.
There is a limit. It's not a case of "if I pump enough cooling into it, I can push it to a billion GHz". There will be a point where the CPU just won't run stable, no matter how much power and cooling you throw at it.
It will void your warranty unconditionally. Even if they market it as "Overclocker friendly", it's still gonna void the shit out of your warranty.
It will decrease the lifespan of your CPU, and the extent depends on how far you push it. It's not going to drop from a 20-year lifespan to 4; it's more that if you run it 24/7 at a strong overclock, you'll shave some unknown number of years off the far end.
Do it if you need to. For me, overclocking is either a case of having tons of money to piss away (I wish I had this problem) or a case of future-proofing. I suggest you build a good system that's ready for overclocking, then wait until you hit a point where you can't do what you want on your current CPU anymore. At that point, overclock. It's not a game of "just throw the specs up a bit, no biggie". Regardless of what people say, there are consequences: you are literally operating your machine outside its designed specifications, even if it's designed to go beyond them.
If you have the stock cooler that came with your CPU, forget about it. If you have an air cooler, be very careful and monitor temps like a hawk. While an unstable OC is usually reversible, a 110°C OC usually is not. The rule of thumb is never go over 80°C at full load for extended periods. Spikes to 90°C are alright, but past that you are living on the edge. 100°C is the danger zone even for good Intel CPUs; if you see 100°C on your silicon, cut the power and cut your OC. Modern CPUs aren't as dangerous to overheat as older ones: it used to be that pumping the multiplier too high could pop your CPU, whereas today throttling and emergency cutoffs will normally save you. They're still never something to rely on.
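Those rules of thumb are easy to encode. Here's a minimal Python sketch; the threshold values come straight from the advice above, but the function name and zone labels are my own invention, and actually reading the temperature (e.g. via psutil's sensors_temperatures() on Linux) is left to you:

```python
def classify_temp(temp_c: float) -> str:
    """Map a CPU temperature to the rule-of-thumb zones described above."""
    if temp_c >= 100:
        return "danger"      # cut the power, cut your OC
    if temp_c > 90:
        return "edge"        # living on the edge
    if temp_c > 80:
        return "spike-only"  # brief spikes are alright, sustained load is not
    return "ok"              # fine for extended full load

print(classify_temp(76))  # -> ok
print(classify_temp(95))  # -> edge
```

Wire that up to whatever sensor readout your platform offers and poll it during stress testing.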
Ask people. If you're unsure about OC, or about experimental and strange computer configurations in general, ask. Waiting a day for a second opinion can mean the difference between a busted box and a busting box. Nobody here is an expert, not even the people who made the chips. By overclocking you are cementing your feet firmly in hobbyist territory, and we hobbyists almost always ask other hobbyists; it's just the smart thing to do. Follow common-sense rules, don't act impatiently, assume that anything too good to be true is, and don't act like you know everything.
I have never overclocked anything in my entire life. All I have done is gather information for the day I decide I'm too far behind, need to put the hammer down, and give my chip a bit of a boost.
To answer the OP's question in my own words: the higher you go on clock speed, the higher you go on power and heat. You can see this in the evolution of PC power consumption and cooling. Up until the DX4 line of 486 chips, CPUs were rated for operation without a heatsink. From there you went from small 8-10-fin heatsinks with a little fan on Socket 3 and 4 machines, to slightly beefier coolers on Socket 5, 7, and 8, to block coolers on Slot 1 and Socket 370, with later sockets like the first Intel LGA sockets and AM2 carrying larger and larger coolers.
Today AMD has sold chips that require water cooling because, with their limited pre-Zen architecture, they had to kick the clock speed up to stay even remotely relevant. This is where you get the old joke that AMD CPUs are space heaters.
Of course transistors have gotten smaller and more efficient, but the higher you go on clock speed, the higher you go on heat and power. With more cores it's somewhat different: more cores do mean more heat and power, but not to the extent that clocking higher does. This is why most Xeon chips have loads and loads of cores at slower clock speeds with fairly moderate TDPs, all on air cooling.
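There's textbook backing for the clock-versus-heat relationship: CMOS dynamic power scales roughly as P ≈ C·V²·f, and because higher clocks usually need higher voltage, power grows faster than the clock itself. A quick Python sketch with purely illustrative numbers (none of these values come from any real chip):

```python
def dynamic_power(c_eff: float, voltage: float, freq_hz: float) -> float:
    """Approximate CMOS dynamic power: P ~ C_eff * V^2 * f."""
    return c_eff * voltage ** 2 * freq_hz

# Illustrative numbers only, not taken from any real CPU:
stock = dynamic_power(1e-9, 1.20, 3.5e9)  # stock clock and voltage
oc    = dynamic_power(1e-9, 1.30, 4.2e9)  # +20% clock with a voltage bump

print(round(oc / stock, 2))  # -> 1.41
```

So a 20% clock bump paired with roughly 8% more voltage costs about 41% more power, which is why the voltage knob heats things up so much faster than the multiplier alone.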
Nobody knows what is going to happen. You can theorize all day, but at the end of said day there will always have been someone, back in history, saying something very similar who was proven wrong later. People have been making these predictions for the past thousand years, and each and every one of them is wrong today. I'm willing to believe history will repeat itself in that respect, though I too can be proven wrong (it's quite annoying, actually). There may be new transistor tech; there may be new architecture tech. We may fundamentally re-envision the entire concept of computing at its very core, but nobody knows as of yet.
My personal opinion is that there will be a way around it. There always has been. Hitting a wall just gives people time to work around it.
Have a good day.