The Beach Makes You Healthier

I was out in California last week visiting friends and eating barbecue when I overheard someone talking about health. They were pretty emphatic that “anyone who lives by the beach is healthier,” and it gave me pause. Was there any truth to that?


I mean sure, you could say the tan would be better. But one clear downside of living by the beach is how hard it would be to keep your feet soft and fresh. (One hour near the sand and it looks like I’m one quarter reptile.) Still, I think the real point is that it’s important to live where you’re happy, and if happy means healthy, then double down and stay there forever.
