In 1929, Edwin Hubble investigated the relationship between the distance of a galaxy from the Earth and the velocity with which it appears to be receding. Galaxies appear to be moving away from us no matter which direction we look, which is thought to be a result of the "Big Bang". Hubble hoped to provide some knowledge about how the universe was formed and what might happen in the future. The data collected include distances (in megaparsecs) to 24 galaxies and their recession velocities (in km/sec). Note: 1 parsec = 3.26 light years.
Hubble's law is as follows:
Recession Velocity = Ho*Distance
where Ho is Hubble's constant, thought to be about 75 km/sec/Mpc. Hence, for every additional megaparsec of distance from the Earth, a galaxy recedes faster by approximately 75 km/sec. Working backward in time, the galaxies appear to converge at a single point. Thus 1/Ho can be used to estimate the time since the "Big Bang" -- a measure of the age of the universe.
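The age estimate 1/Ho can be computed directly with a unit conversion. A minimal sketch, assuming Ho = 75 km/sec/Mpc and the standard conversion 1 Mpc ≈ 3.0857e19 km:

```python
# Estimate the age of the universe as 1/Ho (assuming Ho = 75 km/sec/Mpc).
H0 = 75.0                 # km/sec/Mpc
KM_PER_MPC = 3.0857e19    # kilometers in one megaparsec
SEC_PER_YEAR = 3.156e7    # seconds in one year

age_sec = KM_PER_MPC / H0            # 1/Ho, with Mpc converted to km, gives seconds
age_years = age_sec / SEC_PER_YEAR   # convert seconds to years

print(f"1/Ho = {age_years:.2e} years")  # roughly 1.3e10, i.e. about 13 billion years
```

Note that a larger Ho implies a younger universe, which is one reason the disputed value of the constant matters.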
A regression analysis seems appropriate; however, there is no "constant" term in Hubble's law. We can verify that the constant term of the regression is not significant at any reasonable level of alpha, and thus we can estimate Hubble's constant Ho by fitting a no-intercept regression. The estimate from the data is approximately 424.
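The no-intercept least-squares slope has the closed form b = sum(x*y) / sum(x*x). A minimal sketch of the fit; the distance and velocity values below are hypothetical illustration values, not Hubble's actual 24-galaxy data set:

```python
# No-intercept regression: slope b = sum(x*y) / sum(x*x).
# The data below are hypothetical, chosen only to illustrate the calculation.
distance = [0.5, 0.9, 2.0, 1.4, 0.3]            # Mpc (hypothetical)
velocity = [200.0, 450.0, 850.0, 500.0, 150.0]  # km/sec (hypothetical)

b = sum(x * y for x, y in zip(distance, velocity)) / sum(x * x for x in distance)
print(f"Estimated Ho: {b:.1f} km/sec/Mpc")
```

With real data, a statistics package would also report a standard error for b, which is needed to judge how precisely Ho is estimated.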
Why 424 and not 75?
Does the regression make sense with the constant term = 0? (If the distance from the Earth is zero, should the recession velocity also be zero?)
Does it then make sense to have negative recession velocities and model them with the same no-intercept model?
Note: Sandage and Tammann derive a value near 50 km/sec/Mpc for Hubble's constant.