The phrase "to destroy the planet" means to cause so much damage to the Earth that it becomes uninhabitable for living beings. It refers to actions that severely harm the environment, ecosystems, and climate, ultimately endangering or driving to extinction many species and degrading the planet as a whole.