The phrase "left coast" refers to the western coastline of the United States, specifically the states of California, Oregon, and Washington. These states are called the left coast because they sit on the left side of the country when viewed on a standard map with north at the top.