The term feminism refers to a social, political, and economic movement that advocates for equal rights, opportunities, and treatment for women. It seeks to challenge traditional gender roles and to promote equality between men and women in all aspects of life, including education, employment, politics, and family relationships. A feminist believes in the power of women and in their ability to contribute positively to society, and works toward creating a more equitable world in which everyone is valued and treated with respect.