I might not bet on being able to hear the difference between 96kHz and 192kHz, but I would bet on hearing the difference between 160kbps mp3 and CD, depending of course on the quality of the original source. But most 160kbps mp3s I've heard, I've easily recognized as crap before ever checking the bitrate.
That's a better bet, but still a long shot. As I said, the codec matters, as do the source audio and the listener's training. Making the codec state-of-the-art is easy; source audio that doesn't encode well is unusual; and listeners with training or native acuity good enough to tell a good 160 kbps encode from the source are rare.
Now, I suppose it's possible technology has developed to better encode mp3s at 160kbps with better sound quality. I'd have to hear it to believe it though.
MP3 codecs underwent substantial evolution in the 2000s; at this point it's all fine-tuning. And of course, no one has ever been obligated to stick to a constant 160 kbps bitrate: variable bitrate (VBR) centered around 192 kbps is what I use, and paranoiacs use 320 kbps constant bitrate. All three offer massive filesize reduction (and thus greatly enhanced portability) compared to lossless, with little or no subjective audio degradation for most listeners and sources.
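To put "massive filesize reduction" in rough numbers, here's a quick back-of-envelope sketch. It assumes standard CD audio (44.1 kHz, 16-bit, stereo PCM) and compares its raw bitrate against the three MP3 bitrates mentioned above; actual savings vary with container overhead and, for VBR, with the material being encoded.

```python
# Back-of-envelope comparison: uncompressed CD-quality PCM vs. common MP3 bitrates.
# CD audio: 44.1 kHz sample rate, 16 bits per sample, 2 channels.
cd_kbps = 44100 * 16 * 2 / 1000  # = 1411.2 kbps

for mp3_kbps in (160, 192, 320):
    ratio = cd_kbps / mp3_kbps
    print(f"{mp3_kbps} kbps MP3 is roughly {ratio:.1f}x smaller than CD-quality PCM")
```

Even the "paranoiac" 320 kbps setting still cuts the file to well under a quarter of the uncompressed size, which is why lossy encoding won out for portable use.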