The physical origin of the >0.1 GeV emission detected from gamma-ray bursts (GRBs) by the Fermi satellite is not yet fully understood. In this work, we consider the GeV light curves of 10 GRBs with measured redshift detected by the Fermi Large Area Telescope (LAT). These light curves are characterized by long-lived (≳ 10² s) emission whose luminosity decays in time as a power law. While the decay rate is similar for all GRBs (i.e. L_LAT ∝ t^{-1.2}), the normalization spans about two orders of magnitude in luminosity. However, after re-normalizing the luminosities to the prompt energetics E_prompt, the light curves overlap. We consider the scenario in which the temporally extended LAT emission is dominated by synchrotron radiation from electrons accelerated at the forward external shock. According to this model, at high energies (i.e. above the typical synchrotron frequencies) a small dispersion of the E_prompt-normalized light curves is expected. The fact that the temporally extended LAT emission follows this behaviour reinforces its interpretation in terms of afterglow radiation from external shocks. Assuming this scenario, we argue that the parameters ε_e and η_γ (i.e. the fraction of shock-dissipated energy gained by the electrons, and the efficiency of the mechanism producing the prompt radiation, respectively) must be narrowly distributed.
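A minimal sketch of why small dispersion is expected, assuming the standard external-shock synchrotron scalings above the cooling frequency ν_c (here p is the power-law index of the accelerated electrons, E_k the blast-wave kinetic energy, and ε_B the magnetic equipartition fraction; these symbols are not defined in the text above and are introduced only for illustration):

L_ν(ν > ν_c) ∝ ε_e^{p−1} ε_B^{(p−2)/4} E_k^{(p+2)/4} t^{(2−3p)/4}

This flux is independent of the circumburst density, and for p ≈ 2.3 the temporal index (2−3p)/4 reproduces the observed t^{-1.2} decay. Writing E_k = E_prompt (1 − η_γ)/η_γ, the E_prompt-normalized luminosity depends essentially on ε_e and η_γ (and only weakly on ε_B), so narrow distributions of these two parameters translate into the overlapping normalized light curves described above.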