In physical human-robot interaction, robot behavior must be adjusted to forces applied by the human interaction partner. Such forces can be measured with special-purpose sensors, e.g. force-torque sensors, which are, however, often heavy, expensive, and prone to noise. In contrast, we propose a machine learning approach for measuring external perturbations of robot behavior that uses only commonly available, low-cost sensors. During the training phase, behavior-specific statistical models of sensor measurements, so-called perturbation filters, are constructed using Principal Component Analysis, Transfer Entropy, and Dynamic Mode Decomposition. During behavior execution, perturbation filters compare measured and predicted sensor values to estimate the magnitude and direction of forces applied by the human interaction partner. Such perturbation filters can therefore be regarded as virtual force sensors that produce continuous estimates of external forces.
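The core idea of comparing measured against predicted sensor values can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it assumes a plain least-squares Dynamic Mode Decomposition as the one-step predictor, with simulated sensor data and illustrative names (`A_dmd`, `perturbation_estimate`) chosen for this example. The prediction residual plays the role of the virtual force signal: near zero during unperturbed execution, and equal in magnitude and direction to an applied push.

```python
import numpy as np

# Illustrative sketch (not the paper's code): fit a linear one-step
# predictor via Dynamic Mode Decomposition on unperturbed training data,
# then read large prediction residuals as external perturbations.

rng = np.random.default_rng(0)

# Training phase: sensor snapshots of unperturbed behavior,
# shape (n_sensors, n_steps), generated here by a simulated linear system.
n_sensors, n_steps = 4, 200
A_true = 0.95 * np.eye(n_sensors) + 0.02 * rng.standard_normal((n_sensors, n_sensors))
X = np.zeros((n_sensors, n_steps))
X[:, 0] = rng.standard_normal(n_sensors)
for t in range(n_steps - 1):
    X[:, t + 1] = A_true @ X[:, t]

# DMD in its simplest form: least-squares fit of a linear map A with
# X_{t+1} ≈ A X_t, computed via the pseudoinverse of the snapshot matrix.
A_dmd = X[:, 1:] @ np.linalg.pinv(X[:, :-1])

def perturbation_estimate(x_prev, x_measured):
    """Residual between measured and DMD-predicted sensor values."""
    x_predicted = A_dmd @ x_prev
    residual = x_measured - x_predicted
    return np.linalg.norm(residual), residual  # magnitude and direction

# Execution phase: an external push shows up as a spike in the residual.
x_prev = X[:, 50]
mag_clean, _ = perturbation_estimate(x_prev, A_true @ x_prev)
push = np.array([0.5, 0.0, 0.0, 0.0])  # simulated external force
mag_pushed, direction = perturbation_estimate(x_prev, A_true @ x_prev + push)
```

In this toy setting `mag_clean` stays near zero while `mag_pushed` recovers the magnitude of the push and `direction` its orientation, which is the sense in which such a filter acts as a virtual force sensor.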