The phrase "faith healing" refers to the idea that a person's faith and belief in a higher power can help to cure or heal their physical or emotional ailments. It involves relying on spiritual or religious beliefs as a source of healing rather than conventional medical practices.