Why does the R-squared change when I fix the intercept in my regression line from its least squares value?
R-squared is 1 minus the ratio of the residual sum of squares (the sum of squared vertical differences between the points and the line) to the total sum of squares (the sum of squared differences between the responses and their mean). The line is defined by its intercept, the predicted response where the predictor, x, equals zero, and its slope, which equals the correlation between the predictor and the response when both are standardised.
In particular, the slope and intercept are chosen to minimize the sum of squared residuals:
$$\sum_i \big(\text{observed response at } x_i - \text{predicted response at } x_i\big)^2$$
and hence are called least squares estimates.
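A minimal sketch of this, using hypothetical toy data and the closed-form least squares estimates, shows that moving either coefficient away from its fitted value can only increase the residual sum of squares:

```python
# Hypothetical toy data, assumed purely for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [5.1, 6.9, 9.2, 11.1, 12.8]
n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Closed-form least squares estimates for slope and intercept.
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar

def rss(b0, b1):
    """Residual sum of squares for the line y = b0 + b1 * x."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# Perturbing either coefficient away from its least squares value
# can only increase the residual sum of squares.
print(rss(intercept, slope) <= rss(intercept + 0.5, slope))  # True
print(rss(intercept, slope) <= rss(intercept, slope + 0.5))  # True
```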
Changing either the intercept or the slope away from its "optimal" least squares value (as reported by a statistical package, say) increases the lack of fit, as measured by the residual sum of squares, and consequently decreases R-squared.
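To illustrate with the same hypothetical data, fixing the intercept at 0 (even while re-estimating the best slope subject to that constraint) yields a lower R-squared than the unconstrained least squares fit:

```python
# Hypothetical toy data, assumed purely for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [5.1, 6.9, 9.2, 11.1, 12.8]
n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n
tss = sum((yi - y_bar) ** 2 for yi in y)  # total sum of squares

def r_squared(b0, b1):
    """R^2 = 1 - RSS/TSS for the line y = b0 + b1 * x."""
    rss = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    return 1 - rss / tss

# Unconstrained least squares estimates.
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar

# Fix the intercept at 0 and take the best slope under that
# constraint (regression through the origin).
slope_0 = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi ** 2 for xi in x)

print(round(r_squared(intercept, slope), 3))
print(round(r_squared(0.0, slope_0), 3))
print(r_squared(intercept, slope) > r_squared(0.0, slope_0))  # True
```

Note that some packages switch to an "uncentered" R-squared (with the total sum of squares taken about zero rather than the mean) when the intercept is suppressed, which is another reason the reported value can change.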