Each format has its own pros and cons.
I don't believe CD audio is compressed, unless the recording studios do some weird stuff nowadays.
It's just sampled and quantized -- this is not compression, it's analog-to-digital conversion.
I don't see why it would be compressed: CDs have more than enough space to hold an hour-plus of 16-bit audio at 44,100 samples per second, which is sufficient for the majority of popular music anyway.
Compression would only enter the picture if the recording studio worked with the recordings in a compressed format before putting them onto the CD, which sounds crazy to me, but what do I know.
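The capacity claim above is easy to sanity-check with back-of-envelope arithmetic (assuming stereo, and using a nominal 700 MB disc):

```python
# Back-of-envelope check: raw CD-audio data rate and the size of one hour.
SAMPLE_RATE = 44_100          # samples per second, per channel
BYTES_PER_SAMPLE = 2          # 16 bits
CHANNELS = 2                  # stereo

bytes_per_second = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS
megabytes_per_hour = bytes_per_second * 3600 / 1_000_000

print(bytes_per_second)           # 176400 bytes/s (~1.4 Mbit/s)
print(round(megabytes_per_hour))  # ~635 MB, which fits on a 700 MB CD
```

So an hour of uncompressed audio comes in under the disc capacity, and no compression is needed.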
Ripping vinyl would have to undergo the same analog-to-digital conversion, though since you're not limited by CD space you could sample at a higher resolution (24-bit?) if you really wanted... that probably wouldn't add much. I guess artists who choose purely digital distribution could do that for all music as well.
Vinyl may be analog but lossiness and flaws are not limited to digital. Aside from the fact that all electronics add noise and distortion, vinyl introduces mechanical imperfections, the most obvious being dust.
This can definitely be quantified. Think of the physical size of the vinyl features -- physical length per second, average height of depression per dB of audio, average manufacturing tolerance, physical warping of the disc, dust size/amount, whatever.
Record a perfect sinusoid onto a vinyl and you can very easily get noise and distortion figures.
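Here's a toy version of that measurement: play back a known test tone, subtract the ideal reference, and whatever is left over is noise plus distortion. The noise level below is made up purely for illustration; a real measurement would use the actual recorded signal.

```python
import math
import random

random.seed(0)
fs = 44_100
# Ideal 1 kHz reference tone, one second long
ideal = [math.sin(2 * math.pi * 1000 * t / fs) for t in range(fs)]
# Pretend the turntable playback added noise (hypothetical noise floor)
recorded = [s + random.gauss(0, 0.001) for s in ideal]

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# Whatever survives the subtraction is noise + distortion
residual = [r - s for r, s in zip(recorded, ideal)]
snr_db = 20 * math.log10(rms(ideal) / rms(residual))
print(f"SNR ~= {snr_db:.1f} dB")  # ~57 dB for this made-up noise floor
```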
I think the hissing and popping is pretty obvious to everyone.
Maybe a very well-kept record would do better, but in the best case all you're avoiding is quantization error, and at 16 bits that's roughly 0.0015% of full scale. Even if you lose a few bits to noise, 13 bits is still only about 0.01% error.
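Those percentages are just the quantization step relative to full scale, which you can compute directly:

```python
# Quantization step as a percentage of full scale for a few bit depths.
steps = {bits: 100 / 2 ** bits for bits in (16, 13, 8)}
for bits, pct in steps.items():
    print(f"{bits}-bit: step = {pct:.4f}% of full scale")
# 16-bit -> 0.0015%, 13-bit -> 0.0122%, 8-bit -> 0.3906%
```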
A 44.1 kHz sampling rate captures everything up to 22.05 kHz (the Nyquist limit), which is beyond the ~20 kHz range of human hearing, assuming the analog-to-digital converter isn't horribly designed.
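To see why sampling at more than twice the highest frequency matters: a tone above the Nyquist limit is indistinguishable, once sampled, from a lower "alias" tone. A quick sketch (frequencies chosen arbitrarily):

```python
import math

fs = 44_100
f_in = 25_000        # above the Nyquist limit of fs/2 = 22_050 Hz
f_alias = fs - f_in  # 19_100 Hz

# Sample both tones at fs and compare the sample values
hi = [math.sin(2 * math.pi * f_in * k / fs) for k in range(1000)]
lo = [-math.sin(2 * math.pi * f_alias * k / fs) for k in range(1000)]
max_diff = max(abs(a - b) for a, b in zip(hi, lo))
print(max_diff)  # ~0: the 25 kHz tone is indistinguishable from 19.1 kHz once sampled
```

This is why converters filter out everything above Nyquist before sampling; as long as they do, 44.1 kHz covers the audible band.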
Anyway, the key is what Sitwon keeps mentioning -- high bit rate.
Personally I feel like I can just notice 192 kbps MP3s (on good speakers anyway), and anything lower becomes clearly audible. Often I'm convinced the radio plays poorly compressed audio, which sounds very distinct from general radio noise (i.e. analog interference).
It's all about noticeable compression artifacts.
I don't know how to describe it, but it's a very distinctive sound. If you listen to an MP3 below 96 kbps, or recompress an MP3 several times, you'll definitely hear the problem.
A low-quality JPEG shows the visual equivalent of compression artifacts.
In general the artifacts are worst at rapid transitions like clapping in audio or black/white transitions in an image.
On the topic of audiophile gear, I do think that comes down to preference and subjective opinion. It's pretty well documented that even in well-designed amplifiers, tubes present awful distortion (< 3%) compared to solid state (< 0.01%), but some people prefer the sound of tube distortion, and it's as simple as that. Distortion from transistors, tubes, capacitors, cables, or whatever comes down to the design of the amplifier, so the bottom line is that no specific technology is inherently good or bad. In any case, I think it's pretty clear that modern amplifiers can exceed the resolving power of the human ear. Also notice that these amplifier distortion numbers are worse than the quantization error from analog-to-digital conversion...
Check out this site:
http://sound.westhost.com/absw.htm
It's a very good audio electronics design site, and that page I linked is pretty interesting.
At the end of the day I buy CDs and rip them to MP3 at 192 kbps.
If I had a proper sound system with good speakers, I would listen to the CDs directly or rip them at a higher bit rate. Of course, I would first listen to my 192 kbps rips to see whether I actually cared enough to spend the effort.