• greenskye@lemm.ee

    I’ve always kinda wondered about this. I’m not an audio guy and really can’t tell the difference between most of the standards. That said, I definitely remember tons and tons of ‘experts’ telling me that no one can tell the difference between 720p and 1080p TV at a typical couch distance. And I absolutely could, and so could many people I know. I can also tell the difference between 1080p and 4K at the same distances.

    So I’m curious if there’s just natural variance in an individual’s ability to hear, and audiophiles simply have a better-than-average range that does exceed CD quality?

    Similarly, I can tell the difference between 30fps and 60fps, but not 60 and 120, yet some people swear they can. Which I believe; I just know that I can’t. It seems like these guidelines are more averages than hard biological limits.

    • aleph@lemm.ee

      It’s a fair question. Human hearing ability is a spectrum like anything else. However, when it comes to discerning differences in audio quality, the vast, vast majority of people cannot reliably tell high-bitrate lossy from lossless in a double-blind test. And that includes audiophiles with equipment worth thousands of dollars.

      The tiny minority who can consistently distinguish between the two generally do so by listening very closely for the particular characteristics of the encoder format, which takes a highly trained ear and a lot of practice.

      The blind aspect is important because sighted side-by-side comparisons (whether of audio formats, or 60fps vs 120fps video) are highly unreliable: people subconsciously prefer the one they know is supposed to be better.
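
      None of this is hard to test at home, by the way. Here’s a bare-bones sketch of what an ABX run looks like, in Python (`play_clip()` is a hypothetical stand-in for however you actually play audio on your system):

      ```python
      # Bare-bones ABX trial: X is secretly A or B each round; the listener
      # has to say which. Pure guessing converges on ~50% correct.
      import random

      def run_abx(clip_a, clip_b, trials=16):
          correct = 0
          for _ in range(trials):
              x_is_a = random.choice([True, False])
              # play_clip(clip_a); play_clip(clip_b)             # known references
              # play_clip(clip_a if x_is_a else clip_b)          # this one is "X"
              answer = input("Was X clip A or clip B? [a/b]: ").strip().lower()
              correct += (answer == "a") == x_is_a
          return correct

      # Over 16 trials, 12+ correct beats chance (binomial p ≈ 0.04);
      # anything near 8 means you were guessing.
      ```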

    • DjMeas@lemm.ee

      I think this is a case where certain people simply can’t see or hear the difference.

      I collect video game and movie soundtracks, and the main difference I can hear between a 320 kbps file vs a FLAC that’s in the 1000 kbps range is not straight-up “clarity”, in the sense that something like an instrument is “clearer”, but rather the spacing: the ability to discern where instruments are coming from is much better in a hi-res file with some decent wired headphones (my pair is $200).

      All this likely doesn’t matter much, though, when most users stream via Spotify, which sounds worse than my local 320 kbps files, and people are using Bluetooth headphones at lower bitrates since they don’t support better codecs like aptX and LDAC.

    • oo1@lemmings.world

      I think hi-res is for professional work. If you’re going to process, modify, mix, or distort the audio in a studio, you probably want the higher bit depth or sample rate to start with, in case you amplify or distort something and end up with an unintended artefact that is human-audible. But the output can be downsampled back to human levels before final distribution.
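
      That final step looks roughly like this (a sketch, assuming Python with the soundfile and scipy packages; “master.wav” is just a placeholder path):

      ```python
      # Sketch: knock a 96 kHz master down to CD spec (44.1 kHz / 16-bit).
      import numpy as np
      import soundfile as sf
      from scipy.signal import resample_poly

      audio, rate = sf.read("master.wav")   # assume rate == 96000
      audio = resample_poly(audio, up=147, down=320, axis=0)  # 96000 * 147/320 = 44100

      # TPDF dither (~1 LSB at 16-bit) so truncating the extra bit depth
      # doesn't leave correlated quantization distortion behind.
      lsb = 1.0 / 32768
      dither = (np.random.uniform(-lsb, lsb, audio.shape) +
                np.random.uniform(-lsb, lsb, audio.shape)) / 2
      sf.write("cd_master.wav", audio + dither, 44100, subtype="PCM_16")
      ```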

      Of course, if a marketing person finds out there is such a thing as “professional quality”… See also “military spec” and “aerospace grade”.

      • interolivary@beehaw.org

        Yeah, to expand on this: in professional settings you’ll want a higher sampling frequency so you don’t end up with e.g. aliasing, but for consumer playback anything beyond a 44–48 kHz sampling rate is pretty much pointless.
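
        For instance, here’s a quick numpy sketch of what aliasing actually does: an inaudible 30 kHz tone sampled at 48 kHz without an anti-aliasing filter folds down to a very audible 18 kHz:

        ```python
        # A 30 kHz tone sampled at 48 kHz aliases to |48 - 30| = 18 kHz.
        import numpy as np

        fs, f_tone, n = 48_000, 30_000, 48_000   # 1 second of samples
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f_tone * t)

        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        print(freqs[np.argmax(spectrum)])        # 18000.0, not 30000.0
        ```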

    • bc93@lemmy.world

      There are two main factors that matter for screen resolution: screen size and viewing distance. If you have a small TV and watch it from a regular viewing distance, it’s pretty unlikely you could tell the difference between 720p and 1080p, but if you have a large TV and sit relatively close, the difference will be obvious.
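
      You can put rough numbers on that using the usual ~1 arcminute figure for 20/20 visual acuity (a back-of-the-envelope Python sketch; the 55" screen size is just an example):

      ```python
      # Farthest distance at which adjacent pixels are still resolvable,
      # assuming ~1 arcminute of acuity (the standard 20/20 figure).
      import math

      def max_distance_m(diagonal_in, h_pixels, aspect=16 / 9):
          width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
          return (width_m / h_pixels) / math.tan(math.radians(1 / 60))

      for res, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
          print(res, round(max_distance_m(55, px), 1), "m")
      # 720p ~3.3 m, 1080p ~2.2 m, 4K ~1.1 m on a 55" TV: sit farther back
      # than that and the extra pixels stop being resolvable.
      ```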

      Similarly for audio: good-quality headphones let you hear more, and there’s definitely a factor of frequency response that some humans have better than others, but that’s not really a big factor. Even if you have great hearing and can perceive frequencies as high as 22 kHz (which is extremely rare), it’s not as if you get a lot of extra pleasure out of music just because you can hear a couple of extremely high-pitched sounds that others can’t. And even if you can hear 22 kHz, CD audio can still reproduce it, since 44.1 kHz sampling captures everything up to 22.05 kHz. I don’t believe anyone has ever been able to hear 48 kHz, let alone 96 kHz.

      There are people who can tell the difference between 128 kbps MP3 and CD audio, but I do not believe there is anyone who can tell the difference between CD audio and 192 kHz hi-res audio.

      (BTW, you should probably be able to tell the difference between 60Hz and 120Hz on a computer just from using the menus and moving windows around, etc. If you can’t, it’s likely your computer isn’t configured correctly. Just to let you know!)