The West Coast is known for the “healthy” lifestyle you see on television. People there are portrayed as eating better and being more athletic, and the same is true of environmentalism. Being “green” became a trend on the West Coast partly through celebrity influence originating in Hollywood, and the region is home to many organizations dedicated to protecting the environment. That said, being green isn’t exclusively a West Coast thing; it’s just more prominent there.
These are just generalizations, based on my own observations and beliefs. The urban landscape on the East Coast tends to be denser, so there is less dependence on cars and better public transportation. The West Coast is gifted with a lot of beautiful natural landscape, so people there tend to be more appreciative of nature and more outdoorsy, which makes being green more of a priority for them. Overall, though, it’s hard to say which region is greener; it really depends on the individual.