If the function is rather well behaved in the domain [a,b], we can use a fixed step size. If not, adaptive steps may be needed. Here we concentrate on fixed-step methods only. Let us generalize the above procedure by writing the step y_{i+1} in terms of the previous step y_i,

\begin{equation}
y_{i+1}=y(t=t_i+h)=y(t_i) + h\Delta(t_i,y_i(t_i)) + O(h^{p+1}),
\tag{13}
\end{equation}

where O(h^{p+1}) represents the truncation error. To determine \Delta, we Taylor expand our function y,

\begin{equation}
y_{i+1}=y(t=t_i+h)=y(t_i) + h\left(y'(t_i)+\dots +y^{(p)}(t_i)\frac{h^{p-1}}{p!}\right) + O(h^{p+1}),
\tag{14}
\end{equation}

where we associate the derivatives in the parenthesis with

\begin{equation}
\Delta(t_i,y_i(t_i))=y'(t_i)+\dots +y^{(p)}(t_i)\frac{h^{p-1}}{p!}.
\tag{15}
\end{equation}
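As a sketch of how this one-step scheme can be realized, the following Python snippet (the function name `taylor_step` and the test problem y' = y are illustrative choices, not from the text) assembles \Delta from a user-supplied list of the derivatives y', \dots, y^{(p)} and checks numerically that the one-step error of a p-th order expansion shrinks like O(h^{p+1}):

```python
import math

def taylor_step(t, y, h, derivs):
    """Advance one step, y_{i+1} = y_i + h*Delta(t_i, y_i), where
    Delta = y' + y'' h/2! + ... + y^(p) h^(p-1)/p!  (Eq. (15)).
    derivs is a list of callables [y', y'', ..., y^(p)] of (t, y)."""
    delta = 0.0
    for k, d in enumerate(derivs, start=1):
        delta += d(t, y) * h**(k - 1) / math.factorial(k)
    return y + h * delta

# Test problem: y' = y with y(0) = 1, exact solution e^t.
# For this equation every derivative equals y itself.
for p in (1, 2):
    derivs = [lambda t, y: y] * p
    for h in (0.1, 0.05):
        err = abs(taylor_step(0.0, 1.0, h, derivs) - math.exp(h))
        print(f"p={p}, h={h}: one-step error {err:.2e}")
```

Halving h should reduce the one-step error by roughly 2^{p+1} (a factor of about 4 for p=1, about 8 for p=2), consistent with the O(h^{p+1}) truncation error in Eq. (13). With p=1 this is just Euler's method.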