The Power and Potential of Epigenetic Aging Clocks
The mechanisms behind how we age, and how aging can be measured, have been of interest to researchers for decades. The National Institute on Aging (NIA) highlighted the importance of understanding the dynamics of the aging process in its Strategic Directions for Research for 2020–2025, as well as underlining the need for interventions that reduce the burden of age-related diseases such as cancer, stroke, Alzheimer’s disease and many other conditions.
What if we could predict whether someone was likely to develop an age-related disease later in life? This might enable people at a higher risk of developing an age-related disease to receive intervention earlier, keeping them healthier for longer. Will epigenetics help in such a quest?
Epigenetics describes the impact of chemical modifications on our DNA that alter gene expression. While several different epigenetic modifications exist, DNA methylation is the most studied and therefore the best understood. “DNA methylation [is] a chemical modification to our DNA that doesn’t alter our genetic code but can act a bit like a dimmer switch to turn genes off and on,” Riccardo Marioni, professor of epidemiology and biostatistics at The University of Edinburgh, explains. Methyl groups are added to cytosine residues in DNA by enzymes known as DNA methyltransferases, and this methylation typically acts to repress transcription. In 98% of cases, methylation occurs at positions in the genome where a cytosine residue is linked to a guanine residue by a phosphate (known as a CpG).
What are epigenetic aging clocks?
Methylation marks across our genomes change as we age. For example, a CpG site could have a high chance of being methylated in a 20-year-old, but methylation at this locus might decrease over time. At other loci, the opposite could be true; younger people may have less methylation at that CpG, with the chance of a methylation mark being present increasing with age. “There is loss of methylation due to aging and gain of methylation. Statistically speaking, it’s a linear regression model,” explains Steve Horvath, professor of human genetics and biostatistics at the University of California Los Angeles (UCLA). This loss and gain mechanism has been harnessed to develop “epigenetic aging clocks”, the very first of which was created by Horvath.
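The statistics Horvath describes can be sketched in a few lines: at a single CpG, the age trend of methylation is simply a straight-line fit. The site, effect sizes and donor data below are simulated for illustration and are not taken from any real clock.

```python
# Illustrative sketch (simulated data, not a published clock):
# fitting the linear age trend of methylation at one hypothetical CpG.
import numpy as np

rng = np.random.default_rng(0)

ages = rng.uniform(20, 80, size=200)                 # donor ages in years
# Simulated "beta values": methylation at this CpG declines with age.
beta = 0.9 - 0.004 * ages + rng.normal(0, 0.03, size=200)

# Least-squares fit: beta ≈ intercept + slope * age.
slope, intercept = np.polyfit(ages, beta, deg=1)

print(f"slope per year: {slope:+.4f}")  # negative => age-related loss
```

A positive slope at another site would correspond to the age-related *gain* of methylation the article describes; a clock combines many such per-site trends.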
Epigenetic aging clocks analyze the composition of methylation marks across the genome as a biostatistical measure of “the change in the state of a biological system over time,” according to Dr. Morgan Levine, principal investigator at Altos Labs, who also developed an epigenetic aging clock as a postgraduate. However, she believes the term “clocks” can be misleading, as the methylation changes measured “don’t occur at a set frequency (like a physical clock) and are much less deterministic than a real clock – there is no lifespan countdown happening in our bodies.”
The first epigenetic clocks created could be used to infer calendar age – from the cradle to the grave – but over the past decade, researchers have been developing different clocks that can also be used to predict health outcomes, such as disease and mortality risk.
Chronological vs biological age
Chronological age is the time that has passed in years since a person’s birth, whereas biological age is a measure of accumulated DNA methylation, other epigenetic changes and cell functionality. Chronological aging occurs at a set rate in all people, but biological aging is heavily influenced by factors including disease, lifestyle and genetics.
Examples of existing second-generation epigenetic aging clocks include PhenoAge and GrimAge, the latter of which is “named after the Grim reaper,” says Horvath. He explains that the various iterations of epigenetic clocks “differ in the logic with which they were constructed”, although he admits that an “empirical finding” is that GrimAge provides a better measure of disease risk.
The clocks work by using arrays to measure methylation in prespecified locations in a DNA sample, with current arrays providing information on ~850,000 unique locations. Machine learning algorithms are then used to select which locations are most informative and calculate a biological age estimate based on these. The clocks usually use information from between 300 and 1,000 CpGs.
Depending on the type of clock, methylation at different CpGs can be used for the age estimate, and specific loci can be weighted differently by the clock's algorithm. As a result, some clocks are considered “better” for certain applications by experts, although Levine cautions that “we fundamentally don’t know why the different sets of methyl group locations in each clock have different biological relevance”, meaning we don’t have a complete understanding as to why one clock may be “better” than another.
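Published clocks are typically trained with penalized regression (such as the elastic net), which zeroes out most CpGs on the array and assigns a weight to each of the remainder. The sketch below is a minimal simulated illustration of that selection-and-weighting step, assuming scikit-learn is available; the array size, number of informative sites and penalty settings are arbitrary, not those of any real clock.

```python
# Simulated sketch of clock training: an elastic-net regression selects
# a sparse, weighted subset of CpGs that together predict age.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(1)
n_samples, n_cpgs = 300, 2000        # real arrays cover ~850,000 sites

ages = rng.uniform(20, 80, n_samples)
X = rng.normal(0.5, 0.1, (n_samples, n_cpgs))   # baseline methylation

# Make 50 CpGs genuinely age-informative (some gain, some lose
# methylation with age, echoing the loss/gain mechanism above).
informative = rng.choice(n_cpgs, 50, replace=False)
signs = rng.choice([-1, 1], 50)
X[:, informative] += signs * 0.004 * ages[:, None]

# The L1 part of the penalty drives most coefficients to exactly zero,
# leaving a small weighted panel of CpGs, as real clocks do.
model = ElasticNet(alpha=0.05, l1_ratio=0.5, max_iter=5000).fit(X, ages)
selected = np.flatnonzero(model.coef_)
print(f"CpGs with non-zero weight: {selected.size} of {n_cpgs}")
```

The non-zero coefficients play the role of the clock's per-CpG weights: different training targets (calendar age, mortality, phenotypes) yield different selected sites and weights, which is one way the various clocks come to differ.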
New iterations of epigenetic aging clocks are constantly being developed, with the latest version of GrimAge, released in late 2022, showing improved performance over the original when used to predict mortality risk across multiple ethnic groups.
Applications of epigenetic aging clocks
Horvath describes these clocks not only as aging clocks but also as “life course” clocks that link developmental processes to aging. Although not currently in clinical use, epigenetic aging clocks can improve our understanding of the molecular mechanisms that drive aging and the cellular pathways that are affected as we age.
In Marioni’s opinion, the most likely clinical use of epigenetic aging clocks would be in a precision medicine capacity, to identify individuals most at-risk of developing a specific disease based on their epigenetic profiles. This information could then be used to tailor treatments, for example.
A recent study published in Aging Cell highlights a specific condition where epigenetic age could be applied in this context. Clonal hematopoiesis of indeterminate potential (CHIP) is a precursor to leukemia, where cancer-associated somatic mutations are present in blood or bone marrow cells. It affects around 10% of 70–80-year-olds, and 0.5–1% of cases will progress to leukemia.
Carriers with a high epigenetic age acceleration value (i.e., an epigenetic age higher than expected for their chronological age) showed a higher risk of all-cause mortality, compared to those with a negative epigenetic age acceleration score. This indicates that epigenetic age could be a valuable tool for identifying the CHIP populations in greater need of clinical intervention.
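The "age acceleration" scores referred to here and in the studies below are commonly defined as the residual from regressing epigenetic age on chronological age: a positive residual means a person is epigenetically older than their peers. A minimal sketch with simulated ages (no real cohort data):

```python
# Epigenetic age acceleration as a regression residual (simulated data).
import numpy as np

rng = np.random.default_rng(2)

chron_age = rng.uniform(60, 80, 100)              # chronological ages
epi_age = chron_age + rng.normal(0, 4, 100)       # simulated clock output

# Regress epigenetic age on chronological age; the residual is the
# acceleration score (positive = epigenetically older than expected).
slope, intercept = np.polyfit(chron_age, epi_age, deg=1)
acceleration = epi_age - (slope * chron_age + intercept)

oldest = acceleration.argmax()
print(f"most accelerated: +{acceleration[oldest]:.1f} years vs. peers")
```

By construction the residuals average to zero across the cohort, so the score measures relative aging within a study population rather than an absolute quantity.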
Could epigenetic age predict cognitive decline?
In 2019, a study led by Dr. Jan Bressler, assistant professor of epidemiology, human genetics and environmental sciences at the University of Texas, established that accelerated epigenetic aging was associated with poorer cognitive performance in a cohort of middle-aged African American participants from the Atherosclerosis Risk in Communities (ARIC) study. The results were then replicated in European participants of the Generation Scotland: Scottish Family Health Study, and in European American participants of the ARIC study. This work used the Horvath and the Hannum DNAm aging clocks and, although accelerated aging was correlated with a poorer cognitive performance, Bressler and colleagues found no significant associations between age acceleration and changes in cognitive performance over a six-year period.
Marioni found a similar result in an analysis of the Lothian Birth Cohort 1936, whose participants were in their 70s – a greater epigenetic age acceleration measured using Horvath’s DNAm clock was associated with poorer performance on cognitive tests, but age acceleration couldn’t be used to predict cognitive performance over a short follow-up period.
More recently, a study from Northwestern University, published in Aging, explored the associations between blood-based epigenetic aging, neuroimaging-based brain aging and cognitive function. The study aimed to evaluate the utility of both approaches, classed as “novel aging biomarkers”, for predicting cognitive function, which could in turn serve as a useful marker of dementia risk.
The researchers took a cohort of middle-aged adults participating in the Coronary Artery Risk Development in Young Adults (CARDIA) study, a longitudinal project that commenced in 1983. The researchers measured the participants’ epigenetic age using a variety of clocks, including Horvath’s DNAm Age, Hannum’s DNAm Age, GrimAge and PhenoAge. Between 5 and 15 years later, the participants’ cognitive abilities were measured, and magnetic resonance imaging (MRI) brain scans were obtained.
“DNA methylation was measured at earlier visits before brain MRI because molecular changes could occur years before the brain structural changes,” the authors write in the paper. Machine learning was used to calculate a “brain age” for each participant, based on the MRI scans. The two aging markers were then associated with participants’ performances on the cognitive tests.
The researchers found that accelerated epigenetic age was predictive of cognitive performance 5–10 years later, but only for one of the clocks tested: GrimAge.
A higher GrimAge acceleration score, or a higher brain age than chronological age, was associated with a poorer subsequent cognitive performance. GrimAge and brain age were weakly correlated, indicating they provide complementary information relating to cognitive performance, and combining information from the two novel aging biomarkers provided a more accurate predictor of cognitive decline later in life.
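The gain from combining two weakly correlated markers can be illustrated with a toy regression. The data below are simulated, not the CARDIA measurements; only the variable names echo the study.

```python
# Why two weakly correlated aging markers beat one (simulated data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 500

# Two partly independent aging signals, weakly correlated as reported.
grim_accel = rng.normal(0, 1, n)
brain_gap = 0.2 * grim_accel + rng.normal(0, 1, n)

# Simulated cognition depends on both signals plus noise.
cognition = -0.5 * grim_accel - 0.5 * brain_gap + rng.normal(0, 1, n)

X1 = grim_accel[:, None]                           # one marker
X2 = np.column_stack([grim_accel, brain_gap])      # both markers

r2_single = LinearRegression().fit(X1, cognition).score(X1, cognition)
r2_both = LinearRegression().fit(X2, cognition).score(X2, cognition)
print(f"R² (one marker): {r2_single:.2f}  R² (both): {r2_both:.2f}")
```

Because the second marker carries information the first lacks, the two-predictor model explains more variance, mirroring the study's finding that the blood-based and imaging-based measures are complementary.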
“Epigenetic age is a relatively stable biomarker with strong long-term predictive performance for cognitive function, whereas a brain age biomarker may change more dynamically in temporal association with cognitive decline,” says Yinan Zheng, assistant professor of preventative medicine at Northwestern University and author of the study.
A concern when looking at epigenetic aging, according to Marioni, is that “typically, we see different DNA methylation profiles in different tissues.” Whether patterns of methylation aging in the blood reflect those in the brain is unclear, he adds.
Using a blood-based and a brain-based measure of aging, as the Northwestern University researchers have, could ensure blood-based epigenetic changes are reflective of changes to other biological systems in future studies.
Epigenetic clocks in anti-aging research
Beyond their utility in predictive medicine, epigenetic clocks could also be used as a “surrogate endpoint” to assess drugs or other anti-aging interventions in trials, preventing researchers from having to wait decades for results, Levine explains.
In geroscience studies, different types of biomarkers exist:
● Response biomarkers, which change as a result of an anti-aging intervention
● Predictive biomarkers, where a change in the level of the biomarker is associated with a health outcome (these are also known as prognostic biomarkers if they predict a health outcome related to a condition with which a patient is already diagnosed)
● Surrogate markers, where a treatment-induced change in the biomarker is associated with a health outcome via the same biological mechanism through which the anti-aging treatment acts.
Geroscience is a field of research that aims to understand the genetic, molecular and cellular mechanisms that make aging a risk factor and driver of conditions and diseases of older people.
Let’s say that, hypothetically, reducing epigenetic age is scientifically proven to increase health span. If an anti-aging intervention was developed that also functioned by reducing a person’s epigenetic age (for example, by altering the methylome), we could assume that the drug therefore increases health span. Rather than waiting to see how long the individual lives for, epigenetic aging clocks could be used as a surrogate endpoint in a clinical trial testing that intervention.
However, in their paper published in GeroScience, Steven Cummings, professor of medicine, epidemiology and biostatistics at the University of California, and Stephen Kritchevsky, professor of gerontology and geriatric medicine at Wake Forest University, point out that, as epigenetic aging clocks measure the impact of numerous biological mechanisms of aging, they may be less sensitive to the impact of a specific anti-aging intervention that only affects one biological mechanism of aging.
Can we expect epigenetic clocks to be brought to the clinic?
Although some companies that provide epigenetic age tests exist, Horvath cautions that epigenetic age does not have clinical utility yet, and such testing is “more for fun”. “People shouldn’t think that epigenetic age replaces standard clinical biomarkers [such as blood pressure or glucose levels],” he says. He worries that, due to a complicated non-linear relationship between a person’s epigenetic age and mortality risk, people may misinterpret their epigenetic age estimate and how this relates to their lifespan. However, “while [epigenetic aging clocks] are not able to diagnose diseases, they do provide insight into overall health status of an individual,” says Levine.
“The dream is that these methylation biomarkers become so valuable that they could be used in an annual physical examination,” Horvath continues. In theory, if a person’s biological age (based on methylation markers) was elevated, a doctor could prescribe an anti-aging therapeutic to slow or reverse the rate of epigenetic aging. “Such an intervention doesn’t exist yet, but that’s the dream,” Horvath says.
Limitations of epigenetic clocks
One of the biggest limitations of epigenetic aging clocks is the lack of mechanistic understanding, according to Levine. She questions, “Why do certain CpGs change methylation status with aging? How do the changes affect the functioning of cells and tissues? What is reversible, and how?” Right now, we fundamentally don’t know. In a similar vein, we also don’t know whether an intervention that reverses a person’s epigenetic age will produce a commensurate change in health or lifespan.
An additional bottleneck is time. Current studies assessing epigenetic aging as a biomarker for disease only span 20–30 years. It can’t be said with certainty that a person’s epigenetic age at 20 will predict their likelihood of getting a disease at 70. Horvath, however, is optimistic. He hopes that epigenetic aging clocks could one day be used for risk stratification in children, or even at birth.
Epigenetic aging clocks are also considered “noisy”, says Levine: “You can get vastly different age estimates when assessing the exact same sample twice,” she explains. Levine and her team therefore created a statistical method to reduce the technical noise of epigenetic clocks, although, “often, people just apply the original noisy methods [of analysis],” she says.
When it comes to using epigenetic aging clocks to find and test anti-aging interventions, there are also regulatory barriers to consider. “The FDA doesn’t recognize aging as a disease”, explains Horvath. “It makes it hard for the biotech industry to develop drugs...and find investors.”
He also points out that, in order to develop
Levine has a similar outlook on the prospects of the epigenetic aging field. She envisions that “in the future, there will be more dimensions of aging that we can quantify simultaneously”, with this being “critical” to quantifying aging across cells, tissues and organisms.