JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, vol. 22, no. 1, pp. 15-19, 2011 (SCI-Expanded)
In fuzzy time series analysis, the determination of the interval length is an important issue. In many recent studies, the length of intervals has been determined intuitively. To determine the interval length more systematically, Huarng [4] proposed two approaches, one based on the average and one on the distribution of the data. In this paper, we propose a new method that uses single-variable constrained optimization to determine the interval length. To find the interval length that gives the best forecasting accuracy, we use a MATLAB function that employs an algorithm based on golden section search and parabolic interpolation. Mean square error is used as the measure of forecasting accuracy, so the objective function is the mean square error of the forecasted observations. The proposed method is applied to forecasting the enrollments of the University of Alabama, and the results show that it considerably outperforms existing methods.
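As a rough illustration of the idea (not the paper's code), the sketch below treats the interval length as the single decision variable and minimizes the in-sample mean square error with MATLAB's fminbnd, which is based on golden section search and parabolic interpolation. The search bounds, the helper forecastMSE, and the use of a Chen-style first-order fuzzy time series model as the forecaster are illustrative assumptions, not details taken from the paper.

```matlab
function sketch_interval_length_optimization
% Sketch: choose the interval length that minimizes the in-sample MSE of a
% fuzzy time series forecaster, using fminbnd (golden section search plus
% parabolic interpolation). The bounds 100 and 2000 are assumptions.

% University of Alabama enrollments, 1971-1992 (the series commonly used
% in the fuzzy time series literature).
data = [13055 13563 13867 14696 15460 15311 15603 15861 16807 16919 ...
        16388 15433 15497 15145 15163 15984 16859 18150 18970 19328 ...
        19337 18876];

objective = @(len) forecastMSE(len, data);   % MSE as a function of interval length
[bestLen, bestMSE] = fminbnd(objective, 100, 2000);
fprintf('optimal interval length = %.1f, MSE = %.2f\n', bestLen, bestMSE);
end

function mse = forecastMSE(len, data)
% Chen-style first-order fuzzy time series model, used here as a stand-in
% forecaster: partition the universe into intervals of width len, fuzzify,
% group the fuzzy logical relationships, and forecast with interval midpoints.
edges = (min(data) - len):len:(max(data) + 2*len);   % padded universe of discourse
mids  = edges(1:end-1) + len/2;                      % interval midpoints
[~, ~, idx] = histcounts(data, edges);               % fuzzify: interval index of each observation

groups = cell(numel(mids), 1);                       % fuzzy logical relationship groups
for t = 1:numel(data) - 1
    groups{idx(t)} = unique([groups{idx(t)}, idx(t + 1)]);
end

fcast = zeros(1, numel(data) - 1);
for t = 1:numel(data) - 1
    rhs = groups{idx(t)};
    if isempty(rhs)
        fcast(t) = mids(idx(t));                     % no relationship: keep current midpoint
    else
        fcast(t) = mean(mids(rhs));                  % average of consequent midpoints
    end
end
mse = mean((data(2:end) - fcast).^2);                % in-sample mean square error
end
```

In this sketch fminbnd searches over a continuous interval length; the forecaster inside the objective can be swapped for whatever fuzzy time series model is actually used, since the optimizer only needs the resulting mean square error.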