Let's consider the variation in the response $Y$,
\begin{align}
S_{YY} = \sum_{i=1}^{n} (Y_{i} - \overline{Y})^{2},
\end{align}
and the residual variation left in the response after fitting the inputs,
\begin{align}
SS_{R} = \sum_{i=1}^{n} (Y_{i} - \theta_{0} - \theta_{1}x_{i})^{2},
\end{align}
where $\theta_{0}$ and $\theta_{1}$ are the least-squares estimates of the intercept and slope. The difference
\begin{align}
S_{YY} - SS_{R}
\end{align}
is therefore the variation explained by the inputs. We define $R^{2}$ as
\begin{align}
R^{2} = \frac{S_{YY} - SS_{R}}{S_{YY}} = 1 - \frac{SS_{R}}{S_{YY}}.
\end{align}
$R^{2}$ is the proportion of the total variation explained by the inputs. A value close to 1 implies that most of the variation is explained by the inputs, whereas a value close to 0 means little of it is.
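As a quick numerical illustration, here is a minimal sketch (using NumPy and made-up data; the variable names mirror the symbols above but are not from the original text) that fits the least-squares line and computes $R^{2}$ from the two sums of squares:

```python
import numpy as np

# Hypothetical sample data, purely for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares estimates: theta_1 (slope) and theta_0 (intercept).
theta_1 = np.sum((x - x.mean()) * (Y - Y.mean())) / np.sum((x - x.mean()) ** 2)
theta_0 = Y.mean() - theta_1 * x.mean()

# S_YY: total variation in the response.
S_YY = np.sum((Y - Y.mean()) ** 2)

# SS_R: residual variation after fitting the inputs.
SS_R = np.sum((Y - theta_0 - theta_1 * x) ** 2)

# R^2: proportion of total variation explained by the inputs.
R2 = 1 - SS_R / S_YY
print(f"R^2 = {R2:.4f}")
```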
It can also be shown that the square of the correlation coefficient between $x$ and $Y$ equals the coefficient of determination, i.e. $R^{2} = r^{2}$. Thus, for simple linear regression we know the value of $R^{2}$ directly from $r$.
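A sketch of why this holds, using the standard shorthand $S_{xx} = \sum_{i}(x_{i}-\overline{x})^{2}$ and $S_{xY} = \sum_{i}(x_{i}-\overline{x})(Y_{i}-\overline{Y})$ (introduced here only for this derivation): at the least-squares fit, $\theta_{1} = S_{xY}/S_{xx}$ and $\theta_{0} = \overline{Y} - \theta_{1}\overline{x}$, and the explained variation reduces to
\begin{align}
S_{YY} - SS_{R} = \sum_{i=1}^{n} \left(\theta_{0} + \theta_{1}x_{i} - \overline{Y}\right)^{2} = \theta_{1}^{2}\, S_{xx} = \frac{S_{xY}^{2}}{S_{xx}},
\end{align}
so that
\begin{align}
R^{2} = \frac{S_{xY}^{2}}{S_{xx}\, S_{YY}} = r^{2}, \qquad r = \frac{S_{xY}}{\sqrt{S_{xx}\, S_{YY}}}.
\end{align}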