In-memory computing (IMC) substantially improves the energy efficiency of deep neural network (DNN) hardware by activating many rows simultaneously and performing computation in the analog domain. The noise inherent in analog IMC causes some accuracy loss during hardware acceleration, which is generally regarded as a drawback. In this work, however, we show that such intrinsic hardware noise can instead play a positive role by enhancing adversarial robustness. To exploit this, we propose a new DNN training scheme that incorporates measured IMC hardware noise and aggressive partial-sum quantization at the IMC crossbar. We show that this scheme effectively improves the robustness of IMC DNN hardware against both adversarial input and weight attacks. Compared to conventionally trained DNNs, robustness improves by up to 10.5% (CIFAR-10 accuracy) against black-box adversarial input attacks and by up to 33.6% (number of bit-flips tolerated) against bit-flip weight attacks.
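To make the training scheme concrete, the sketch below simulates one IMC crossbar matrix-vector product with the two effects the abstract names: additive analog noise on each partial sum and aggressive uniform quantization of the partial sums before digital accumulation. This is an illustrative model only; the function name `imc_matvec` and the parameters `noise_std`, `psum_bits`, and `rows_per_psum` are hypothetical placeholders, not the paper's actual measured noise model or quantizer.

```python
import numpy as np

def imc_matvec(x, W, noise_std=0.05, psum_bits=4, rows_per_psum=64, rng=None):
    """Toy model of an IMC crossbar matrix-vector product for noisy training.

    Rows are activated in groups of `rows_per_psum`; each group's analog
    partial sum receives additive Gaussian noise (standing in for measured
    hardware noise) and is uniformly quantized to `psum_bits` bits before
    the partial sums are accumulated digitally.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = np.zeros(W.shape[1])
    for start in range(0, W.shape[0], rows_per_psum):
        # analog partial sum over one group of activated rows
        psum = x[start:start + rows_per_psum] @ W[start:start + rows_per_psum]
        # inject analog-domain noise (illustrative Gaussian model)
        psum = psum + rng.normal(0.0, noise_std, size=psum.shape)
        # aggressive uniform partial-sum quantization
        levels = 2 ** psum_bits - 1
        scale = np.max(np.abs(psum)) or 1.0
        psum = np.round(psum / scale * (levels / 2)) * (2 * scale / levels)
        out += psum  # digital accumulation of quantized partial sums
    return out
```

During training, replacing exact layer matmuls with a forward pass like this exposes the network to the same noise and quantization it will see on the IMC hardware, which is the mechanism the abstract credits for the robustness gain.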