Measurable (also known as minimal) residual disease (MRD) is an established component of acute lymphoblastic leukemia (ALL) management, and expert guidelines include assessment of MRD as the standard of care following induction and consolidation therapy, as well as at additional time points, depending on the treatment regimen administered.
Although the utility of MRD in ALL has been well defined over recent decades, several questions remain. In this article, the authors review the use of MRD in ALL, addressing the increasing sensitivity of MRD detection, MRD surveillance, peripheral blood versus bone marrow assessments, and central nervous system (CNS) analysis.
Depth of MRD Assessment and Association with Outcomes
Previous clinical practice guidelines used the term MRD to describe detectable leukemia below the traditional remission threshold of 5% blasts by morphologic assessment. However, with the increased sensitivity of modern MRD detection assays, it is now generally recognized that a sensitivity of at least 10⁻⁴ (0.01%, i.e., one leukemic cell per 10,000 nucleated bone marrow cells) is required for adequate assessment of MRD in ALL.
The main advantages and disadvantages of MRD assessment methods, including multiparameter flow cytometry (MFC), allele-specific oligonucleotide (ASO)-based quantitative polymerase chain reaction (qPCR), and next-generation sequencing (NGS), are summarized in TABLE 1.
While high-cost NGS-based MRD detection, which achieves a sensitivity of 10⁻⁶ (0.0001%), has been increasingly utilized in clinical practice, the clinical significance of detecting residual disease below the 10⁻⁴ threshold remains unclear. Recent studies suggest that highly sensitive MRD assessment likely offers value in certain clinical settings, but further research is required to more clearly ascertain the optimal MRD thresholds across therapies and patient populations.
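To put these thresholds in context, the depth an assay can reach is bounded by the number of cells it analyzes: an assay cannot report one leukemic cell in a million if far fewer than a million cells are sampled. The short calculation below is a minimal sketch assuming simple binomial sampling and perfect per-cell detection (real assays face additional losses from sample quality, background, and marker informativeness, and the function name is illustrative, not from any cited study); it estimates the minimum cell input required to detect MRD at a given frequency with a given confidence.

    import math

    def min_cells_for_detection(mrd_frequency: float, confidence: float = 0.95) -> int:
        # Smallest n with P(>=1 leukemic cell among n sampled) = 1 - (1 - f)^n >= confidence,
        # which rearranges to n >= ln(1 - confidence) / ln(1 - f).
        return math.ceil(math.log(1 - confidence) / math.log(1 - mrd_frequency))

    for f in (1e-4, 1e-6):
        print(f"MRD {f:.0e}: at least {min_cells_for_detection(f):,} cells needed")
    # Roughly 30,000 cells suffice at 10^-4, but about 3 million are needed at 10^-6.

Under these assumptions, reliably reaching 10⁻⁶ demands roughly 100-fold more input material than 10⁻⁴, one practical reason deeper assays carry higher cost and stricter sample requirements.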
Post-Remission MRD Monitoring
Although an MRD-negative response to therapy generally portends superior outcomes, up to 25% of patients with ALL who achieve initial MRD negativity will subsequently relapse, with the majority of clinical relapses occurring within the first three years after diagnosis.
There are currently no formal guidelines on the frequency or duration of MRD monitoring; clinicians may thus perform strict MRD monitoring every few months, forgo further MRD assessments in patients with an early MRD response, or employ any strategy in between. In the authors’ practice, adult patients who achieve remission following induction undergo NGS MRD evaluation monthly until MRD negativity is attained, with reassessment just prior to allogeneic hematopoietic stem cell transplantation. For patients receiving a pediatric-based regimen, NGS MRD is monitored after induction, consolidation, and delayed intensification, and then every three to six months during maintenance therapy.
The limited data currently available suggest that serial MRD monitoring may offer prognostic value and, in the current era of available MRD-targeted therapies, an opportunity for early intervention. Future work should focus on optimizing the frequency and duration of post-remission monitoring.
Peripheral Blood as a Source for MRD Assessment
In ALL, MRD assessment has traditionally required repeated bone marrow aspirations, which are invasive, expensive, and time-consuming. With the advent of more sensitive MRD technologies, MRD monitoring using peripheral blood is increasingly being studied as an acceptable, and perhaps preferable, alternative. In one recent study, serial MRD monitoring of the blood allowed for early detection of clinical relapse and enabled early intervention, with subsequent MRD clearance in some patients.
MRD detection using peripheral blood must be performed with an appropriately sensitive assay. For example, several investigations compared the sensitivity of MFC and qPCR in peripheral blood versus bone marrow. Peripheral blood samples from patients with B-ALL exhibited MRD levels approximately three orders of magnitude lower than those in bone marrow, underscoring the importance of marrow assessment when MFC or qPCR is used to track MRD.
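The practical consequence of that log difference can be seen with a back-of-the-envelope projection. The sketch below is illustrative only: it assumes a fixed three-log (1,000-fold) blood-to-marrow ratio, consistent with the B-ALL studies described above, although the true ratio varies by patient and compartment; the nominal assay sensitivities in ASSAY_LIMITS are the figures cited earlier, not validated specifications.

    # Illustrative projection: assumes a uniform 3-log blood-to-marrow MRD ratio.
    BLOOD_TO_MARROW_LOG_DIFF = 3.0
    ASSAY_LIMITS = {"MFC": 1e-4, "qPCR": 1e-4, "NGS": 1e-6}  # nominal sensitivities

    def expected_blood_mrd(marrow_mrd: float) -> float:
        # Project the peripheral-blood MRD level from a bone marrow level.
        return marrow_mrd / 10 ** BLOOD_TO_MARROW_LOG_DIFF

    blood_level = expected_blood_mrd(1e-4)  # marrow disease at the 10^-4 threshold -> 10^-7 in blood
    for assay, limit in ASSAY_LIMITS.items():
        print(f"{assay}: blood MRD {blood_level:.0e} detectable? {blood_level >= limit}")
    # All three print False: marrow disease at 10^-4 projects below every listed assay's limit in blood.

Under this assumption, marrow disease sitting at the conventional 10⁻⁴ threshold would be invisible in blood even to NGS, which is why blood-based monitoring is most informative with the deepest available assays and why marrow assessment remains important when MFC or qPCR is used.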
The authors’ approach involves monitoring peripheral blood NGS MRD every two to three months during the first year following transplantation or chimeric antigen receptor (CAR) T-cell therapy, with additional NGS MRD evaluation of the bone marrow at two months post-treatment and whenever clinical or laboratory concerns warrant bone marrow examination.
MRD and CNS Leukemia
Identification and prevention of CNS involvement are critical elements of ALL management. Approximately 20% to 40% of ALL relapses occur in the CNS, either in isolation or in combination with bone marrow involvement. Applying MRD techniques to standard cerebrospinal fluid (CSF) analysis may offer additional clinical value and has been the subject of several studies in pediatric and adult ALL.
CSF MRD assessment by MFC can be readily incorporated into routine clinical practice, since laboratories that perform MFC-based bone marrow MRD analysis are also capable of performing MFC analysis of CSF. Although CSF MRD positivity by MFC increases the risk of CNS relapse, no specific recommendations are available for how to mitigate that risk. Additional studies evaluating NGS- and/or PCR-based MRD in CSF would help clarify the role of MRD in the monitoring of CNS leukemia.
Conclusion
Although the utility of MRD in ALL has been well defined over recent decades, several questions remain, including the extent to which increasing the depth of MRD assessment adds prognostic value, the role of ongoing MRD monitoring once patients achieve an MRD response, the extent to which peripheral blood MRD assessment can substitute for bone marrow assessment, and whether MRD assays should be applied to analysis of the CNS. Current and future studies should help answer many of these important questions and will continue to clarify the role of MRD evaluation in ALL.