Would you say that the West Coast of the United States is greener than the East Coast?

2 Answers


  1. 0 Votes

    The West Coast is known for the “healthy” lifestyle you see on television. People there are portrayed as eating healthier and being more athletic, and the same holds for environmentalism. Being “green” became a trend on the West Coast partly through celebrity influence originating in Hollywood, and the region is also home to many organizations dedicated to protecting the environment. That said, it’s important to note that being green isn’t exclusively a West Coast thing; it’s simply much more prominent there.

  2. 0 Votes

    These are just generalizations based on my own observations and beliefs. The urban landscape on the East Coast tends to be denser, so there is less dependence on cars and better public transportation. The West Coast is gifted with a lot of beautiful natural landscape, so people there tend to be more appreciative of nature and more outdoorsy, which makes being green more of a priority for them. Overall, though, it’s hard to say which region is greener; it really depends on the individual.
