The phrase "employer mandate" refers to a legal requirement that employers provide certain benefits, most commonly health insurance coverage, to their employees. In the United States, the term usually refers to the Affordable Care Act provision requiring large employers to offer affordable health coverage to full-time employees or pay a penalty. Employers subject to the mandate must comply with it, ensuring their workers have access to these benefits.