In this paper, we derive near-optimal rates of convergence in the minimax sense, in a multivariate setting, for estimating partial derivatives of the mean function of functional data observed under a fixed synchronous design, over H\"older smoothness classes. We focus on the supremum norm since it corresponds to the visualisation of the estimation error, and is closely related to the construction of uniform confidence bands. In contrast to mean function estimation, for derivative estimation the smoothness of the paths of the processes is crucial for the rates of convergence. On the one hand, if the paths have higher-order smoothness than the order of the partial derivative to be estimated, the parametric $\sqrt n$ rate can be achieved under a sufficiently dense design. On the other hand, for processes with rough paths of lower-order smoothness, we show that the rates of convergence are necessarily slower than the parametric rate, and we determine a near-optimal rate at which estimation is still possible. We implement a multivariate local polynomial derivative estimator and illustrate its finite-sample performance in a simulation study as well as on two real data sets. To assess the smoothness of the sample paths in the applications, we further discuss a method based on comparing restricted estimates of the partial derivatives of the covariance kernel.
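To illustrate the local polynomial derivative estimation referred to above, the following is a minimal one-dimensional sketch, not the paper's multivariate implementation: the pointwise sample mean over curves is regressed on a local polynomial around a target point, and the derivative estimate is read off from the corresponding coefficient. The function name, kernel choice, and all parameters are illustrative assumptions.

```python
import math
import numpy as np

def local_poly_derivative(t_grid, y_bar, t0, h, deriv=1, degree=2):
    """Estimate the deriv-th derivative of the mean function at t0.

    t_grid : synchronous design points (1-d array)
    y_bar  : pointwise sample means of the observed curves at t_grid
    h      : bandwidth; degree should exceed deriv
    Uses locally weighted least squares with an Epanechnikov kernel.
    """
    u = (t_grid - t0) / h
    w = np.maximum(1.0 - u ** 2, 0.0)                     # Epanechnikov weights
    X = np.vander(t_grid - t0, degree + 1, increasing=True)  # columns 1, (t-t0), (t-t0)^2, ...
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y_bar * sw, rcond=None)
    # The deriv-th Taylor coefficient times deriv! estimates the derivative.
    return math.factorial(deriv) * beta[deriv]

# Hypothetical usage: n noisy curves observed on a common grid
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 101)
curves = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((50, t.size))
y_bar = curves.mean(axis=0)
print(local_poly_derivative(t, y_bar, t0=0.5, h=0.15, deriv=1))
# true first derivative at 0.5 is 2*pi*cos(pi) = -6.28...
```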