"Feminist art" refers to artworks created by women artists that aim to highlight and challenge gender inequality and promote women's rights. It often explores themes such as body image, reproductive rights, and women's experiences in society. This genre of art seeks to inspire social change, empower women, and challenge traditional gender roles and stereotypes.