Behavioral and neurophysiological findings in vision suggest that perceptual grouping is not a unitary process and that different grouping principles have different processing requirements and neural correlates. The present study examines whether the same holds in the haptic modality, using two grouping principles widely studied in vision: spatial proximity and texture similarity. We analyzed behavioral responses (accuracy and response times, RTs) and conducted an independent component analysis of brain oscillations in the alpha and beta bands for haptic stimuli grouped by spatial proximity and texture similarity, using a speeded orientation detection task performed on a novel haptic device (MonHap). Behaviorally, RTs were faster for patterns grouped by spatial proximity than for those grouped by texture similarity. Independent component clustering analysis revealed the activation of a bilateral network of sensorimotor and parietal areas while participants performed the task. We conclude that, as in visual perception, grouping the elements of the haptic scene by spatial proximity is faster than forming the same objects by texture similarity. In addition, haptic grouping appears to involve a widely distributed bilateral network of sensorimotor and parietal areas, as reflected by the consistent event-related desynchronization (ERD) found in the alpha and beta bands.
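To make the analysis pipeline concrete, the sketch below illustrates the general approach the abstract describes: ICA decomposition of EEG epochs, followed by alpha/beta band power and ERD computation on the component activations. It is a minimal illustration under stated assumptions, not the study's actual pipeline: the synthetic data, sampling rate, channel and component counts, baseline window, and the use of scikit-learn's FastICA together with MNE-Python's tfr_array_morlet are all assumptions made here for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA
from mne.time_frequency import tfr_array_morlet

# Synthetic stand-in for preprocessed EEG epochs: the real study's data,
# channel count, and epoch window are not given here, so these are assumptions.
rng = np.random.default_rng(0)
sfreq = 250.0                              # sampling rate (Hz), assumed
n_epochs, n_chan, n_times = 40, 32, 500    # 2 s epochs, assumed
X = rng.standard_normal((n_epochs, n_chan, n_times))

# Fit the ICA unmixing matrix on the concatenated epochs (samples x channels),
# then apply it epoch-wise to obtain independent component activations.
ica = FastICA(n_components=15, random_state=0, max_iter=1000)
flat = X.transpose(1, 0, 2).reshape(n_chan, -1).T        # (samples, channels)
ica.fit(flat)
sources = np.stack([ica.transform(ep.T).T for ep in X])  # (epochs, comps, times)

# Morlet-wavelet power of each component across 8-30 Hz, averaged over epochs.
freqs = np.arange(8.0, 31.0)
power = tfr_array_morlet(sources, sfreq=sfreq, freqs=freqs,
                         n_cycles=freqs / 2.0, output="power").mean(axis=0)

# Event-related desynchronization (ERD): percent power change relative to a
# pre-stimulus baseline; negative values indicate desynchronization.
baseline = power[..., :100].mean(axis=-1, keepdims=True)  # first 0.4 s, assumed
erd = 100.0 * (power - baseline) / baseline
alpha_erd = erd[:, freqs <= 12, :].mean(axis=1)   # alpha band: 8-12 Hz
beta_erd = erd[:, freqs >= 13, :].mean(axis=1)    # beta band: 13-30 Hz
```

In a sketch like this, sustained negative values in alpha_erd and beta_erd over sensorimotor and parietal components would correspond to the alpha- and beta-band desynchronization reported above.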