The brain must interpret sensory input from diverse receptor systems to estimate object properties. Much has been learned about the brain mechanisms underlying these processes in vision, whereas our understanding of haptic perception remains comparatively limited. Here we examined haptic judgments of object size, which require integrating multiple cutaneous and proprioceptive afferent signals, as a model problem. To identify candidate human brain regions supporting this process, participants (N=16) in an event-related fMRI experiment grasped objects to categorise them as one of four sizes. Object sizes were calibrated psychophysically to be equally distinct for each participant. We applied representational similarity logic to whole-brain, multi-voxel searchlight analyses to identify brain regions exhibiting size-relevant voxelwise activity patterns. Of particular interest were regions in which more similar sizes produced more similar patterns of activity, which constitutes evidence of a metric size code. Regions of the intraparietal sulcus and the lateral prefrontal cortex met this criterion, both within and across hands. We suggest that these regions compute representations of haptic size that abstract over the specific peripheral afferent signals generated in a grasp. A matched visual size task, performed by the same participants and analysed in the same fashion, identified similar regions, indicating that these representations may be partly modality-general. We consider these results in relation to perspectives on magnitude estimation in general and to computational views on perceptual signal integration.