Just about every industrialized country has access to the know-how to make features down to about 30-40 nm.
The prohibitive part is the cost of building such foundries, and the number of defective chips (the yield loss) you are willing to put up with.
Over the last 50 years, growing pure silicon crystals, from which the wafers are sliced (and the individual dies later cut), has become an art.
And the cost of setting up a foundry and clean lithography facilities has become astronomical.
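To get a feel for why the defect tolerance mentioned above hurts so much, here is a minimal sketch using the textbook Poisson yield model; the defect densities and die areas are made-up illustrative numbers, not figures for any real process.

```python
import math

def poisson_yield(defects_per_cm2: float, die_area_cm2: float) -> float:
    """Poisson yield model: fraction of dies expected to have zero defects."""
    return math.exp(-defects_per_cm2 * die_area_cm2)

# Illustrative (made-up) numbers: a mature process vs. a still-immature one,
# for a small die vs. a big GPU-sized die.
for d0 in (0.1, 0.5):                  # defects per cm^2
    for die_mm2 in (100, 600):         # die area in mm^2
        y = poisson_yield(d0, die_mm2 / 100.0)
        print(f"D0={d0}/cm^2, die={die_mm2} mm^2 -> yield ~ {y:.0%}")
```

The same defect density that is tolerable on a small die can wipe out most of a wafer's worth of large dies, which is a big part of why a new fab takes years to pay for itself.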
IIRC, AMD built their own fabs in Germany in the early 2000s, to compete with Intel.
They eventually found it uneconomical, spun it off as GlobalFoundries, and now rely mostly on TSMC, along with Nvidia and Apple.
And while IBM maintains its own research facilities, I don't think they fabricate any more either.
Once you get down below 30-40 nm, things become a little more complicated.
Proprietary techniques are used, which sometimes blur the line between feature sizes, or make for an 'apples to oranges' comparison.
For example, 3D features and mixed feature sizes made Intel's 14 and 10 nm chips roughly comparable to TSMC's 10 and 7 nm chips.
But since the 'general impression' is that power consumption and number of transistors are related to feature size, being able to claim smaller features is a big marketing advantage.
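As a back-of-the-envelope illustration (not anyone's official methodology), here is a sketch of the density gain a node name would imply if it were a literal linear dimension; real 'x nm' labels no longer correspond to any measurable feature, which is exactly why the comparisons get apples-to-oranges.

```python
def implied_density_gain(old_node_nm: float, new_node_nm: float) -> float:
    """Density gain implied by reading node names as literal linear feature sizes."""
    return (old_node_nm / new_node_nm) ** 2

print(implied_density_gain(14, 10))  # ~1.96x implied by "14 nm" -> "10 nm"
print(implied_density_gain(10, 7))   # ~2.04x implied by "10 nm" -> "7 nm"
# In practice the labels are marketing names: an Intel "10 nm" process and a
# TSMC "7 nm" process end up with transistor densities in the same ballpark.
```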
TSMC is already advertising what they call 5 nm chips and probably sampling 2 nm chips.
Keep in mind that the de Broglie wavelength of electrons at these energies is on the order of nanometers, so once features and the barriers between them shrink toward the single-nm range, electrons start tunneling between 'features', leaking current and eventually making such chips worthless.
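For a rough sense of scale, here is a textbook-level sketch (free-electron mass, an arbitrary 1 eV rectangular barrier, simple WKB estimate, so an illustration rather than a device model) of the electron wavelength and of how fast tunneling grows as a barrier shrinks toward 1 nm.

```python
import math

H = 6.626e-34          # Planck constant, J*s
HBAR = 1.055e-34       # reduced Planck constant, J*s
M_E = 9.109e-31        # free electron mass, kg
EV = 1.602e-19         # 1 eV in joules

# De Broglie wavelength of a room-temperature electron: lambda = h / sqrt(2 m E).
# Comes out around 7-8 nm, i.e. nanometers rather than picometers.
kT = 0.0259 * EV
wavelength_nm = H / math.sqrt(2 * M_E * kT) * 1e9
print(f"thermal de Broglie wavelength ~ {wavelength_nm:.1f} nm")

# Crude WKB tunneling estimate through a rectangular barrier:
# T ~ exp(-2 * kappa * d), with kappa = sqrt(2 m V) / hbar.
V = 1.0 * EV                               # assumed barrier height (illustrative)
kappa = math.sqrt(2 * M_E * V) / HBAR
for d_nm in (3.0, 1.0, 0.5):
    t = math.exp(-2 * kappa * d_nm * 1e-9)
    print(f"barrier {d_nm} nm wide -> transmission ~ {t:.0e}")
```

In this toy model, going from a 3 nm barrier to a 0.5 nm barrier raises the transmission by roughly ten orders of magnitude, and even tiny leakage per transistor adds up once you multiply it by billions of transistors.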