In patients presenting with obvious signs of glaucoma and vision loss, the decision to start glaucoma treatment is relatively straightforward. The benefit of initiating treatment, in terms of preventing further loss of vision and maintaining quality of life (QoL), generally greatly outweighs the negatives of treatment. Choosing to begin therapy in a glaucoma suspect, on the other hand, is a more difficult decision to make on a patient’s behalf. Even among glaucoma specialists, there can be significant uncertainty regarding the appropriateness of treatment initiation in glaucoma suspects.1 Within a 10- to 15-year span, one untreated glaucoma suspect may notice changes in visual function and progress to overt glaucoma, while the next may remain stable. Glaucoma suspects have risk factors, most notably elevated intraocular pressure (IOP) or a suspicious-appearing optic disk, that predispose them to glaucoma in the absence of frank glaucomatous damage. Ocular hypertension, a well-studied risk factor defined as a baseline IOP of >21 mmHg, affects approximately 8 % of adults over the age of 40 in the US.2 Another major group of glaucoma suspects comprises those with clinically suspicious-appearing optic disks in the absence of visual field loss, which may demonstrate an asymmetric cup-to-disk ratio (C/D), focal or diffuse changes of the neuroretinal rim, disk hemorrhages, or suspicious nerve fiber layer alterations. Examination of the optic nerve head is probably the most important step in the diagnosis of glaucoma, and these findings should alert the clinician to the possibility of early glaucoma.
One reason for the controversy surrounding treatment initiation is that once started, medical therapy is generally continued indefinitely. The ophthalmologist must constantly balance the risk for possible long-term irreversible visual disability against life expectancy, treatment side effects, financial impact, and negative effects on QoL. Ultimately, the goal of therapy is not to lower IOP but to preserve functional vision as well as QoL. In this article we will review risk factors for progression to glaucoma in glaucoma suspects and how to synthesize this information when deciding to initiate treatment in these patients.
Risk Assessment—Whether Or Not To Treat
The biggest challenge in deciding on whom to start treatment is the inability to predict with certainty which glaucoma suspect will progress to visual dysfunction and which will remain stable. This decision involves simultaneously taking into account all known risk factors with which a patient may present in order to arrive at an estimate of his or her probability of developing glaucoma. The most important predictors for developing glaucomatous damage are the extent of damage already present and the current rate of disease progression.3,4 Although progressive change in the optic disk or visual field is the defining feature of glaucoma, generally, it is not possible to assess change on initial patient encounters, since rates of progression can only be observed over time. Therefore, one usually relies on static risk factors to assess risk. However, in patients who are being monitored as glaucoma suspects, subsequent examination and evidence of progression may prompt the clinician to reassess risk and perhaps initiate treatment.
Elevated IOP has been shown in almost all population-based studies to be a leading risk factor for the presence or development of glaucoma,5 and both the incidence and prevalence of glaucoma increase with increasing IOP.6 Advancing age has also been identified as a risk factor in numerous studies, with prevalence increasing exponentially with age, although the age-dependent increase differs among races. The odds ratio (OR) of open angle glaucoma per decade is 2.05 in white, 1.61 in black, and 1.57 in Asian populations.7 The Ocular Hypertension Treatment Study (OHTS) additionally identified central corneal thickness (CCT), pattern standard deviation (PSD), and vertical and horizontal C/D as independent predictive variables for the development of glaucomatous damage in ocular hypertensives.8 Other factors, including race,9–11 sex,12 family history,12–14 myopia,15,16 vasospasm,17,18 and alterations in the cardiovascular system,19–24 also play a role in assessing a person’s overall risk for developing glaucoma.
Given the large number of possible risk factor combinations, coupled with the fact that not all risk factors are equally weighted, it is not surprising that ophthalmologists assess glaucoma risk inconsistently.25 Compared with formal, quantitative risk calculation, clinical assessment often leads to the underestimation of an individual’s actual risk for developing glaucoma.25,26 Clinicians, in general, are relatively poor assessors of risk. This tendency to underestimate risk is well established in the field of cardiology, where physicians and nurses have been shown to have only moderate accuracy in evaluating the risk for cardiovascular events.27–29 To help better stratify patients based on their individual risk level, the Framingham Heart Study investigators developed a formal risk calculator that has since been emulated by other large prospective studies of cardiovascular disease. These calculators, each applicable to certain patient populations, are readily available over the Internet, although they do not appear to be regularly used in clinical practice30,31 for various reasons, such as time constraints.32,33
The innate complexity of the decision-making process for glaucoma suspects has led to the development of a formal risk calculator and other tools to help ophthalmologists simplify the evaluation of risk. The most notable of these is the Scoring Tool for Assessing Risk (STAR) developed from the findings of OHTS. Available on the Internet and through mobile phone applications, the calculator estimates the 5-year risk for progressing from ocular hypertension to early glaucoma. It uses a mathematical formula, borrowed from the Framingham investigators, to calculate risk based on six variables (age, IOP, PSD, CCT, vertical C/D ratio, diabetes). If the calculated risk is <5 % it recommends observation; if it is >15 % it recommends treatment. In the 5–15 % range, no explicit recommendation is made. The validity of this tool was assessed in an independent population of 126 ocular hypertensive subjects prospectively followed in the Diagnostic Innovations in Glaucoma Study (DIGS). This study demonstrated that the STAR calculator was able to discriminate DIGS subjects who developed glaucoma from those who did not in approximately 70 % of cases (c-indices of 0.68 [full model] and 0.73 [reduced model]).34 The updated STAR II calculator (which excludes diabetes as a factor) was also validated in the combined OHTS and European Glaucoma Prevention Study (EGPS) dataset, which confirmed good calibration and discrimination with a c-index of 0.74.35 Another calculator based on OHTS data was developed by Cioffi and Mansberger (available at www.discoveriesinsight.org).
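The STAR decision bands described above amount to a simple threshold mapping. As an illustration only (the function name is ours, and the underlying OHTS-derived regression formula that produces the risk estimate is not reproduced here), the guidance can be sketched as:

```python
def star_recommendation(five_year_risk_pct: float) -> str:
    """Map an estimated 5-year risk (%) of conversion from ocular
    hypertension to early glaucoma onto the STAR guidance bands:
    <5 % -> observation, >15 % -> treatment, and 5-15 % -> no explicit
    recommendation (labeled 'indeterminate' here)."""
    if five_year_risk_pct < 5:
        return "observe"
    if five_year_risk_pct > 15:
        return "treat"
    return "indeterminate"
```

The sketch makes plain what the calculator does and does not decide: risks in the 5–15 % band are returned to the clinician without a recommendation.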
While the STAR II calculator is a useful clinical tool, it does have limitations. It is reliably applicable only to patients who resemble the recruited subjects of OHTS: age 30–80, IOP of 24–32 mmHg in the first eye and 21–32 mmHg in the second eye, best corrected visual acuity (BCVA) of 20/40 or better, spherical refraction within ±5 D, cylinder correction within ±3 D, gonioscopically open angles, and no evidence of glaucomatous damage by nerve appearance or visual field. These criteria exclude many of the glaucoma suspects seen clinically and for whom a decision to treat or observe must be made. While the calculator has been validated in an independent population of patients of mostly European descent, it may not be valid in other populations. Additionally, important risk factors that contribute to the development of glaucoma, e.g., family history, are not included. Other factors that affect the decision to initiate treatment in real-life situations, such as life expectancy, cost of treatment, effect on QoL, and patient preferences, are also not accounted for by the STAR. Finally, the level of risk at which treatment initiation is recommended to prevent the earliest signs of glaucoma was set arbitrarily by consensus opinion rather than on validated outcomes.
For example, using the STAR calculator, a 78-year-old patient with an IOP of 18–23 mmHg in one eye and 20–21 mmHg in the fellow eye, CCT of 549 and 537 μm, C/D of 0.6 and 0.5, and PSD of 2.9 and 2.1 has an approximately 30 % risk of developing glaucoma in 5 years, which places this patient at high risk. If this patient also has metastatic lung cancer and has suffered several cerebrovascular accidents, a clinician may opt to monitor this patient off treatment, contrary to the results of the risk calculator, given the advanced age, multiple comorbidities that will limit lifespan, and otherwise functional current level of vision. However, if a healthy 60-year-old presents with similar ocular and visual field findings, the risk calculator predicts a 5-year risk of approximately 21 %, which is slightly lower but still within the high-risk category. In this patient, treatment most likely should be initiated in accordance with the risk calculator results, given the longer life expectancy and the potential for functional visual loss without treatment.
To address some of the STAR calculator limitations, the RAND-UCLA Appropriateness Method (RAM) set out to identify those glaucoma suspects in whom treatment is appropriate or inappropriate. The RAM is a well-known method used to make recommendations for medical problems lacking evidence from randomized controlled trials, or when the evidence available from such trials provides an insufficient level of detail applicable to a wide range of patients. The RAM process involves an extensive literature review and the development of hypothetical clinical scenarios based on that review. These scenarios are presented to an expert panel for at least two rounds of scenario rating, both before and after discussion of the literature review. Statisticians then perform analyses of the expert panel scenario ratings to create tools that can predict the collective judgment of the expert panel. The results of this method have predictive validity and have been used in many areas of medicine,36,37 including ophthalmology.38 In 2008, a RAM was initiated to ask the question, “For which glaucoma suspect is treatment appropriate?” The Glaucoma Suspect RAM developed a point system, based on ANOVA statistical analysis, which includes variables missing from the STAR II calculator, notably, life expectancy, family history, and disk size.2 PSD was not included in the RAM point system tool, as it was believed that most clinicians do not utilize PSD for risk assessment. Points are assigned to all the variables of an individual patient, and the final score predicts the panel’s rating of inappropriate, indeterminate, or appropriate to treat (see Table 1). No predictive tool will be able to synthesize every single risk factor for a given patient, and the same is true of the RAM. However, the strength of the RAM is that it is a validated method that provides guidance for a much broader range of glaucoma suspects than does the STAR II.
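The structure of a RAM-style point system can be illustrated as a small scoring function. The feature names, point values, and cut-offs below are hypothetical placeholders, not the values from Table 1; only the mechanism (sum per-variable points, map the total onto one of three ratings) reflects the tool described above.

```python
# Hypothetical point values for illustration only -- the actual RAM
# weights and variables appear in Table 1 of the source.
HYPOTHETICAL_POINTS = {
    "elevated_iop": 3,
    "family_history": 2,
    "large_cup_for_disk_size": 3,
    "short_life_expectancy": -4,
}

def ram_rating(features):
    """Sum the points for a patient's features and map the total onto
    the expert panel's three ratings (cut-offs are illustrative)."""
    score = sum(HYPOTHETICAL_POINTS[f] for f in features)
    if score <= 0:
        return "inappropriate"
    if score >= 5:
        return "appropriate"
    return "indeterminate"
```

Note how a single negative-weighted variable such as limited life expectancy can outweigh several risk factors, which is precisely the kind of trade-off the STAR II cannot express.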
Risk-assessment tools may help simplify the decision to treat or observe a glaucoma suspect by providing the clinician with one more piece of information, namely whether the patient is at low, intermediate, or high risk according to OHTS/EGPS data or according to an expert panel. The results can further validate the ophthalmologist’s own assessment, or may prompt him or her to re-evaluate the situation if the tool disagrees with that assessment. One of the biggest advantages of these tools is having a concrete number to share with patients when discussing their risk. Patients who fall into the indeterminate groups of the STAR II and RAM may pose the greatest clinical challenge, as neither tool makes an explicit recommendation for them and there are no long-term studies of their outcomes. Future studies should concentrate on this group of patients. Despite being based on evidence from strong randomized controlled studies, these tools should supplement, not substitute for, a physician’s clinical decisions based on his or her interactions with the patient. There will always be legitimate reasons to provide recommendations that contradict those of the risk-assessment tools.
Treatment Initiation In The Glaucoma Suspect—When To Treat
As previously mentioned, glaucoma suspects can be categorized into two groups: subjects with significant risk factors for the future development of glaucoma (e.g., increased IOP), and subjects with very early glaucomatous damage that cannot definitively be distinguished from normal (e.g., a suspicious appearance of the optic disk). The matter of treatment initiation in the former type of glaucoma suspect was investigated in the OHTS. Although OHTS showed a significant reduction in the development of glaucoma with treatment, approximately 90 % of patients in the observation group did not develop glaucoma during the first 5 years of the study.39 Given that established glaucoma is known to progress slowly in the vast majority of patients,40 and that there is little evidence that health-related QoL is affected until there is moderate or marked damage on visual fields,41 some experts propose withholding treatment from even moderate- to high-risk ocular hypertensives and instead following them on an annual or semi-annual basis until glaucomatous damage is demonstrated.42 A pivotal consideration in treatment initiation is whether or not delaying therapy will result in harm to the patient. In phase II of OHTS,43 all subjects were invited to continue under the same protocol except that the original observation group subjects were invited to start treatment. By the end of phase II, subjects had either been treated for a median of 13 years (early treatment) or for 5.5 years after an initial 7.5 years of observation (delayed treatment). The penalty for postponing treatment was examined by comparing outcomes of the delayed- versus early-treatment groups. The cumulative proportion developing primary open angle glaucoma (POAG) was 0.22 in the delayed-treatment group versus 0.16 in the early-treatment group (a 27 % reduction), a modest penalty for delaying treatment overall.
When examining subgroups by risk, there was a greater benefit of early treatment in the high-risk group, where the number-needed-to-treat (NNT) to prevent one case of early POAG, defined as either a reproducible visual field abnormality or clinically significant optic disk deterioration in one or both eyes, was 7, compared with 98 in the low-risk group. The protective effect of treatment was observed for both visual field abnormalities and optic disk deterioration and, compared with the original medication group, participants in the original observation group developed both structural and functional POAG endpoints more frequently. It is important to keep in mind that the concept of NNT is one of frequency, not utility. The decision of whether or not to treat an individual based on an NNT of 7 remains at the discretion of the clinician, who must assess the individual patient’s risk profile and lifetime risk. The OHTS did not evaluate the ultimate goals of glaucoma therapy, namely the prevention of visual impairment or blindness. Rather, the endpoint of OHTS was the onset of early glaucomatous damage, and therefore the question remains whether there is benefit to earlier treatment of ocular hypertensives as a group.
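The NNT figures above follow directly from the absolute risk reduction between groups; the helper below is a generic arithmetic sketch (the function name is ours, not part of OHTS).

```python
def number_needed_to_treat(risk_untreated: float, risk_treated: float) -> float:
    """NNT is the reciprocal of the absolute risk reduction."""
    return 1.0 / (risk_untreated - risk_treated)

# OHTS phase II cumulative POAG proportions: 0.22 (delayed) vs 0.16 (early),
# i.e. roughly 17 ocular hypertensives would need early treatment to
# prevent one case of early POAG overall.
overall_nnt = number_needed_to_treat(0.22, 0.16)
```

Read the other way, the high-risk subgroup's NNT of 7 corresponds to an absolute risk reduction of about 14 %, while the low-risk subgroup's NNT of 98 corresponds to about 1 %.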
It is interesting to note that in OHTS, participants in both the early and delayed groups continued to develop POAG throughout the study. In their discussion, the study authors comment that participants seemed to segregate into those destined to develop POAG and those who remained stable throughout follow-up. Individuals who eventually developed POAG had slightly worse values of mean deviation (MD) and PSD at baseline, possibly suggesting a subclinical prodrome before conventional clinical glaucomatous damage could be detected. For the glaucoma suspect with a suspicious appearing disk (rather than with ocular hypertension), there are no large randomized controlled studies of outcomes with and without treatment. For these patients, the clinician must rely on clinical experience and extrapolate information from studies whose subjects may not resemble his or her patient. Studies have demonstrated that treatment is more effective at preventing damage from glaucoma when initiated at an earlier stage of the disease.4 So, the more suspicious the nerve appears, if nerve fiber layer loss is detected, or if other significant risk factors exist for glaucoma, the more likely it is that treatment should be initiated. For example, a number of studies have shown that patients who have glaucoma tend to have thinner CCT.44–46 The presence or absence of this particular risk factor in two otherwise similar patients may sway one to treat or observe. Likewise, the presence of family history or secondary causes of glaucoma (like pseudoexfoliation) may favor the decision to initiate treatment for a glaucoma suspect because these factors increase the risk for progressing to glaucoma when there is a suspicious disk.
The optic nerve exam is paramount in all patients, but particularly in glaucoma suspects diagnosed by disk appearance. Since the presence of documented, progressive glaucomatous optic disk changes has been noted as the best currently available reference standard for the diagnosis of glaucoma,47 it is important to have an accurate baseline against which to make future comparisons. However, subjective examination disparity and the inherent overlapping variability in the size and shape of optic disks among normal patients and those with glaucoma can limit optic nerve assessment. When assessing whether a neuroretinal rim is possibly glaucomatous, optic disk size should be taken into account because it will influence the interpretation of rim loss.48 The average optic disk has a C/D of 0.3 to 0.4 and an approximate area of 2.1 to 2.7 mm²,49 but wide variability exists. Moreover, the significance of the cup size depends on the size of the disk.50 Despite our knowledge that cup size is related to disk size, it can still be difficult to determine whether a large disk with a high C/D is pathologic or physiologic. Substantial axonal loss may develop before defects occur on kinetic perimetry,51 and therefore a normal visual field test does not rule out glaucoma. Evaluation of the nerve fiber layer may be particularly important in these glaucoma suspect cases.
Imaging modalities, such as optical coherence tomography (OCT), scanning laser polarimetry, and confocal scanning laser ophthalmoscopy, have taken on a prominent role as a reference standard for the diagnosis of glaucoma.52 The DIGS elucidated the importance of baseline retinal nerve fiber layer (RNFL) thickness as a prognostic measure for future glaucomatous damage.53 It demonstrated that only 20 % of eyes with an average RNFL thickness greater than the lowest quartile developed visual field damage or progressive glaucomatous optic neuropathy compared with 40 % of eyes within the lowest quartile. In glaucoma suspects, particular attention to those with a thinner average superior and inferior RNFL thickness at baseline alone, or in combination with age, IOP, PSD, CCT, and stereophotographic assessment, may hold predictive value in the development of repeatable visual field damage and progressive glaucomatous optic neuropathy.53 Although these imaging studies are highly reproducible and provide good correlation with structural and functional tests, they are complementary tools that are most useful in confirming and monitoring the clinical diagnosis of glaucoma. Therefore, in a patient in whom the probability of having glaucoma as measured by clinical examination is low, imaging studies alone are unlikely to lead to the diagnosis of glaucoma. Indeed, no test has been shown to be superior to qualitative evaluation of optic disk stereophotographs;54 however, recent studies have suggested that Moorfields Regression Analysis55 and OCT56 may be useful in predicting progression in a subset of glaucoma suspects.
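The DIGS lowest-quartile stratification described above is a simple cohort percentile cut. A minimal sketch, assuming a list of average RNFL thicknesses (function names are ours; the 40 % versus 20 % event rates come from the study, not from this code):

```python
from statistics import quantiles

def lowest_quartile_cutoff(rnfl_um):
    """First-quartile cut-off of a cohort's average RNFL thicknesses (um).
    Eyes at or below it form the DIGS-style lowest-quartile stratum
    reported to progress about twice as often (40 % vs 20 %)."""
    return quantiles(rnfl_um, n=4)[0]

def in_lowest_quartile(value_um, cohort_um):
    """Flag an eye whose average RNFL thickness falls in the cohort's
    lowest quartile."""
    return value_um <= lowest_quartile_cutoff(cohort_um)
```

Such a cut is cohort-relative, which is one reason the study combined RNFL thickness with age, IOP, PSD, CCT, and stereophotographic assessment rather than relying on it alone.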
Principles For Treatment Initiation— Quality of Life
It is essential to involve patients in the decision to start therapy and in the choice of a treatment plan. Actively educating patients can improve adherence,57–59 and patients have a right to know about side effects, costs, and how treatment will affect their QoL. In patients with glaucoma and in glaucoma suspects, QoL can fall dramatically for several reasons, including worry and anxiety over the diagnosis of a chronic disease, functional loss, inconvenience of treatment, side effects of treatment, and cost of therapy. Few studies are available that compare different treatment modalities and their effects on QoL, although the number is increasing.60,61 The Collaborative Initial Glaucoma Treatment Study (CIGTS) evaluated QoL as one outcome measure comparing patients receiving medications versus trabeculectomy and found no significant difference between the two groups with early glaucoma.62 A more recent study confirms the similarity in QoL scores between medically and surgically treated moderate and advanced glaucoma, but suggests that diagnosing and performing surgery in early-stage glaucoma may significantly negatively impact a patient’s psychological QoL.63 Presumably, aggressive initial treatment in glaucoma suspects would have the same effect.
In addition to topical therapy, laser trabeculoplasty, of both the argon and selective types, is an alternative initial or replacement treatment for glaucoma suspects, given its efficacy and relatively safe side-effect profile.64–66 The Glaucoma Laser Trial showed that in newly diagnosed open angle glaucoma, argon laser trabeculoplasty was at least as effective as initial treatment with timolol maleate 0.5 %, even after 7 years,67,68 and no difference in health-related QoL was found between groups. Selective laser trabeculoplasty (SLT), which uses less energy and is less invasive, presumably has a similar effect on QoL, making it a good first-line choice for patients who prefer laser treatment to medications.
Conclusion
Deciding when to treat a patient to prevent glaucomatous damage cannot be simplified into a straightforward algorithm. Several considerations, both objective (e.g., optic nerve status, IOP level, visual field defects) and subjective (e.g., patient lifestyle, adherence), must be taken into account. These factors must be weighed against our growing knowledge of how and why glaucoma develops and whether treatment is efficacious. Fortunately, a great deal of active investigation has guided our decision-making process over the last decade, permitting treatment decisions based on sound data from large-scale, prospective, randomized clinical trials.