People who use sunscreen often receive less protection than they expect, researchers warn today.
Professor Antony Young of King's College London, UK, investigated typical sunscreen use in a study of 16 fair-skinned men and women. The group was split in half, and each half received a different pattern of ultraviolet radiation exposure.
One group received a single ultraviolet radiation exposure – to simulate sunlight – on areas of skin treated with SPF 50 sunscreen applied at thicknesses of 0.75mg, 1.3mg or 2mg (the recommended amount) per square centimetre. The second group was exposed on five consecutive days to varying ultraviolet radiation levels, to mimic typical holiday exposure.
Analysis of skin biopsies showed considerable DNA damage in the areas that received less than the recommended thickness of sunscreen, even though the ultraviolet radiation dose was very low.
Analysis also showed that five days of exposure to high-dose ultraviolet radiation with sunscreen applied at 2mg per square centimetre caused significantly less damage than just one day's low-dose ultraviolet radiation exposure without sunscreen.
The researchers point out: "It is well known that people don’t receive the full ultraviolet radiation blocking benefit of sunscreen, because they are applying it more thinly than manufacturers recommend."
Details appear in Acta Dermato-Venereologica today (25 July).
The authors state: "Results showed that sunscreen with a sun protection factor of 50, applied in a typical way, would at best provide 40% of the expected protection. We suggest that consumers use a much higher SPF sunscreen than they think necessary, to ensure they’re protected from sun damage."
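To put that figure in context (this model is an illustration, not part of the study's own analysis), the photoprotection literature sometimes approximates the effective SPF as falling exponentially with applied thickness t, measured in mg per square centimetre, relative to the 2mg per square centimetre used in standard SPF testing:

effective SPF ≈ labelled SPF^(t/2)

Under that assumed model, SPF 50 applied at 1.3mg per square centimetre would deliver an effective SPF of roughly 50^0.65 ≈ 13, and at 0.75mg per square centimetre roughly 50^0.375 ≈ 4 – illustrating how steeply protection falls away as application gets thinner.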
Professor Young said: "What this research shows is that the way sunscreen is applied plays an important role in determining how effective it is. Given that most people don't use sunscreens as tested by manufacturers, it's better for people to use a much higher SPF than they think is necessary."
Young, A. et al. Acta Dermato-Venereologica, 25 July 2018