Various studies, conducted by researchers at Anglia Ruskin University, University College London and Perdana University in Malaysia, have looked at just this: how green spaces can affect the way we think about our bodies. In one, British students were shown photos of either natural places or more built-up areas, and those who looked at the leafy scenes reported a more positive body image, while the other group did not.

Another study had participants walk around in either a natural setting or an urban one. Once again, those who were exposed to greenery reported a significantly higher appreciation for their bodies than those who spent time in the city.

The findings make logical sense. Cities can be stressful places where you're bombarded with advertisements pushing an unrealistic standard of beauty. It's perhaps no wonder that you soon become hypercritical of your own body.

Compare this to spending some time in the park. Not only do you feel much more peaceful, but that positive state of mind carries over into how you feel about your body.

The researchers describe nature as "restorative": it can help wipe away negative thoughts about your appearance, whilst also offering the chance to distance yourself from appearance-obsessed urban spaces.

Away from the claustrophobia of cities and towns, you get the opportunity to appreciate your body for all it does for you, as opposed to judging it harshly on aesthetics.

Add this to the host of other benefits of being amongst nature, from a stronger sense of community and safety to better mental health, and it's pretty clear we all need some green time.

It's worth noting that in one of the studies, the improvement to body image was smaller when the green space was crowded with other people. This is probably because participants became annoyed by the number of people around, and may have experienced some negative social encounters.

The studies focused on the short-term, positive impact that nature can have on body image. The longer-term effects are unclear, but if regular exposure to greenery helps foster a lasting positive relationship with your body, that would be pretty amazing.