This paper presents a new approach to data-driven modeling of isotropic haptic textures using frequency-decomposed neural networks trained on contact acceleration data captured while a stylus is scanned over a textured surface with diverse scanning velocities and normal forces. We first describe a motorized texture scanner developed for accurate and convenient data collection under a wide variety of conditions. We then propose two neural network models with different topologies: a unified model that feeds all of the acceleration data, scanning velocity, and normal force as input variables to a single large neural network, and a decomposed model that consists of a number of smaller neural networks, each trained with the acceleration data for one pair of scanning velocity and normal force. An experiment with real texture samples showed that the unified model has better cross-validation performance in terms of spectral RMS errors, and that its performance is comparable to the best reported in the literature. In addition, we present preliminary results of anisotropic texture modeling achieved by extending the unified model.
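To make the two topologies concrete, the following is a minimal NumPy sketch of how the unified and decomposed models might be structured. Everything here is an illustrative assumption: the autoregressive input layout (a window of past acceleration samples), the layer sizes, the tanh activations, and the grid of (velocity, force) training conditions are not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of the two model topologies described above.
# Layer sizes, activations, and the condition grid are illustrative assumptions.

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a small fully connected network."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """tanh hidden layers, linear output: predicts the next acceleration sample."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

HISTORY = 32  # number of past acceleration samples used as input (assumed)

# Unified model: one large network that also takes velocity and force as inputs.
unified = init_mlp([HISTORY + 2, 64, 64, 1])

def predict_unified(history, velocity, force):
    x = np.concatenate([history, [velocity, force]])
    return forward(unified, x)[0]

# Decomposed model: one small network per (velocity, force) training condition;
# at synthesis time, pick the network for the nearest recorded condition.
conditions = [(v, f) for v in (50.0, 100.0, 200.0)   # scan speeds in mm/s (assumed)
                     for f in (0.5, 1.0, 2.0)]       # normal forces in N (assumed)
decomposed = {cond: init_mlp([HISTORY, 32, 1]) for cond in conditions}

def predict_decomposed(history, velocity, force):
    nearest = min(conditions,
                  key=lambda c: (c[0] - velocity) ** 2 + (c[1] - force) ** 2)
    return forward(decomposed[nearest], history)[0]

history = rng.standard_normal(HISTORY)
a_unified = predict_unified(history, 120.0, 1.2)
a_decomposed = predict_decomposed(history, 120.0, 1.2)
```

The sketch highlights the trade-off the abstract alludes to: the unified network can interpolate across velocities and forces it never saw together during training, while the decomposed variant must fall back on the nearest trained condition (or some interpolation between conditions), which is one plausible reason for the unified model's better cross-validation performance.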