Linear trend
From Wikipedia, the free encyclopedia
In time-series analysis, a linear trend is a simple method of forecasting variation in a quantitative variable. Least-squares regression is used to regress the quantitative dependent variable Y (to be predicted) on a constant C and an equally spaced time variable t; the coefficient on t expresses the trend variation.
Y_t = C + b·t + ε_t,

where t is the time index, b is the trend coefficient, and ε_t is an error term.
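A minimal sketch of this fit, using NumPy's least-squares solver on a synthetic series (the data, trend value, and noise level below are illustrative assumptions, not from the article):

```python
import numpy as np

# Synthetic series: Y_t = C + b*t + noise, with C = 3.0 and b = 0.5 assumed
t = np.arange(20).astype(float)        # equally spaced time index
rng = np.random.default_rng(0)
y = 3.0 + 0.5 * t + rng.normal(0.0, 0.2, size=t.size)

# Design matrix [1, t]; least squares estimates the pair (C, b)
X = np.column_stack([np.ones_like(t), t])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
C, b = coef
print(f"intercept C ~ {C:.2f}, trend b ~ {b:.2f}")

# Forecast by extrapolating the fitted line to future time indices
future = np.arange(20, 25)
forecast = C + b * future
```

The trend coefficient b recovered here should be close to the assumed 0.5 per time step; the forecast is simply the fitted line evaluated at future values of t.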