"Earth creationism" refers to the belief that the Earth and everything on it, including humans, were created by a divine being or higher power. It suggests that all life forms and natural features on Earth are the result of a deliberate act of creation, rather than gradual development or evolution.