This is a question that has been bothering me as someone whose country was colonized by the British Empire. We were taught about it in school and how it lost power over time, but never how the USA came to take its place, especially over such a short period compared to the British Empire.
More that as independence movements grew and the economy shifted toward high-value manufactured goods, colonies became more expensive to hold and less profitable.
Combine that with the anti-Nazi rhetoric, and holding colonies militarily became much harder to justify, especially as Britain, unlike France, has never really seen itself as a military power.
We much prefer to control the seas, play local groups off against each other, and help the preferred side come out on top and act as a local ruling elite.