derailed the thread, so it's no longer off-topic
Not sure, because I've lived most of my life in the southeastern US, mostly Georgia and Florida. But my mom lived in LA years ago, and she said that in the more affluent areas people are pretty health-conscious; basically, the closer you are to the beach, the "healthier" people are (or seem to be, at least).


