The sensitivity of nuclear magnetic resonance (NMR) probes, especially the recently introduced cryogenic probes, can be substantially reduced by the electrical noise generated by conductive samples. In particular, samples of biological macromolecules, which usually contain salts to keep the pH constant and to prevent aggregation, can suffer a significant loss in sensitivity. So far this dependence has forced researchers to minimize the salt concentration of their samples. Here we demonstrate that the decisive factor is not the salt concentration itself but the conductivity, which is a function of both the concentration and the mobility of the ions in solution. We show that by choosing buffers with low ionic mobility, the sample conductivity can be dramatically reduced and the sensitivity substantially enhanced compared to the same measurement with an equal concentration of a standard NMR buffer such as phosphate. We further show that the maximum sensitivity gain of one buffer over another is equal to the square root of the ratio of their ion mobilities, and we describe a simple method to evaluate the effect of a given buffer on the sensitivity.
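The square-root relation stated above can be sketched numerically. The conductivity values below are purely hypothetical placeholders for two equal-concentration buffers (they are not measurements from this work); the point is only that halving the conductivity four-fold doubles the attainable sensitivity.

```python
import math

def sensitivity_gain(cond_ref, cond_low):
    """Upper bound on the S/N gain of a low-conductivity buffer over a
    reference buffer: the square root of the ratio of their sample
    conductivities (equivalently, of their ion mobilities at equal
    concentration)."""
    return math.sqrt(cond_ref / cond_low)

# Hypothetical illustrative conductivities (mS/cm) at equal concentration:
phosphate_buffer = 8.0      # standard high-mobility reference buffer
low_mobility_buffer = 2.0   # buffer with low ionic mobility

print(round(sensitivity_gain(phosphate_buffer, low_mobility_buffer), 2))  # 2.0
```

A four-fold reduction in conductivity thus yields at most a two-fold gain in sensitivity, consistent with the square-root dependence described in the text.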