Here’s the testing process all headphones go through, so you get reliable, factual information that helps you make a better decision. Last update: We updated the ANC and noise isolation testing process and explanation.

When we get new headphones, we go through a set process: first, we listen to music, then we measure sound quality with specialized tools, measure battery duration, check the features, and more. We learn their strengths and weaknesses, so you don’t have to. Based on how we test headphones, you get a better idea of whether you want to spend your hard-earned money on the headphones you have your eye on. Below, we go through our testing methods for each major category.

Testing Sound

Testing sound requires 2 methods: objective measurements using specialized tools, and careful listening by a trained ear.

A real person listens to headphones to describe what to expect of the sound. Every human ear is different, so a testing rig cannot capture the complete picture. Our editor (Peter) subjectively assesses headphones and compares them to hundreds he’s already listened to. Sound in the real world isn’t the same as in sterile lab conditions, so we believe it’s best to combine objective data with real-life experience.

A careful listener can pinpoint headphone characteristics, but our brains often play tricks on us. Measurements across the audible range from 20 Hz to 20,000 Hz confirm our hypotheses and give objective testing results. They also reveal things that are hard to hear, like characteristics in the upper treble. More on that later.

What Determines Sound Quality?

There’s no objectively best sound quality, since it depends on a listener’s taste. However, the Harman curve is probably the best approximation and the most widely accepted audio quality standard for headphones. The Harman curve is close to a neutral response, but not quite. Read more about the Harman target curve.
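To make "closeness to a target curve" concrete, one simple way to score it is the root-mean-square of the per-frequency deviation between the measured response and the target. This is our own illustrative sketch, not the site's scoring method, and the target and measurement values below are made up (the real Harman target is a dense, published curve):

```python
import numpy as np

# Hypothetical coarse target and measured responses in dB, sampled at a few
# frequencies (Hz). Values are illustrative only.
freqs = np.array([20, 100, 1000, 3000, 10000, 20000])
target_db = np.array([6.0, 4.0, 0.0, 3.0, -2.0, -5.0])    # made-up target curve
measured_db = np.array([9.0, 5.0, 0.0, 1.0, -1.0, -8.0])  # made-up measurement

# Per-frequency deviation, plus one RMS number as an overall "closeness" score.
deviation = measured_db - target_db
rms_error = np.sqrt(np.mean(deviation ** 2))
print(f"RMS deviation from target: {rms_error:.2f} dB")
```

A smaller RMS means the headphones track the target more closely, though, as noted above, a close match is still no guarantee you'll personally like the sound.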
After measuring the frequency response, creating a graph, and comparing it to the Harman curve, we still have to evaluate more subjective characteristics. Things that a graph can’t help us with, like: Is the sound clear? How detailed is it? And most importantly, how enjoyable is it? We listen to different songs we’re familiar with, evaluating the overall technical performance of specific headphones.

P.S. While a pair of headphones might have a close-to-perfect Harman curve frequency response, that’s no guarantee you’ll like it. Many people prefer emphasized deep bass frequencies that you can feel in your ears. In that case, the Harman curve won’t be your first choice. In the end, the most versatile option is headphones with effective equalization (EQ), where you can change the sound to your liking. Some companion apps let you tweak the headphones’ sound using a custom equalizer.

In our evaluation, we focus on particular frequency ranges:

Bass

We first listen to the bass (low range). The upper bass is responsible for adding warmth and body, whereas the sub-bass adds the punch. Listening to hip-hop and pop shows us the sub-bass extension and how well headphones handle bassy audio. Evaluating speed and control is also on our list. The best way to test that is by playing fast-paced music genres, like drum n’ bass or power metal. We focus on hearing individual beats and double pedal hits.

Mid-range

The mid-range is equally important, even though it’s far less popular than bass or treble. In each song, we ask questions like: Do instruments sound natural? How well do headphones render vocals? Is there a difference between male and female vocals, or do both sound natural? How dynamic is the sound?

P.S. Loud snare drums are the best indicator of good dynamics in headphones (dynamic range is the difference between the quietest and loudest signal in a song).
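As a rough illustration of what "dynamic range" means in practice, here's a small sketch (our own, not part of the testing rig) that estimates it from raw audio samples by comparing the loudest and quietest short-term RMS windows:

```python
import numpy as np

def dynamic_range_db(samples, sr=44100, window_s=0.4):
    """Rough dynamic range: dB gap between the loudest and quietest RMS window."""
    win = int(sr * window_s)
    n = len(samples) // win
    rms = np.array([
        np.sqrt(np.mean(samples[i * win:(i + 1) * win] ** 2)) for i in range(n)
    ])
    rms = rms[rms > 0]  # ignore windows of digital silence
    return 20 * np.log10(rms.max() / rms.min())

# Synthetic "song": a quiet passage followed by a loud, snare-like loud passage.
sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
quiet = 0.01 * np.sin(2 * np.pi * 440 * t)  # quiet verse
loud = 0.5 * np.sin(2 * np.pi * 440 * t)    # loud chorus
dr = dynamic_range_db(np.concatenate([quiet, loud]), sr)
print(f"Estimated dynamic range: {dr:.1f} dB")
```

Headphones that compress this gap, making quiet and loud passages sound similarly loud, are what we'd describe as lacking dynamics.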
Treble

Since treble (high range) is the hardest to master for headphone manufacturers, we put extra effort into dissecting it. The treble test involves listening to well-produced tracks, where we look for natural-sounding cymbals. Cymbal crashes shouldn’t be too grainy, splashy, or artificial. Listening to instruments and vocals helps determine how much air is in the upper treble. Airiness makes the listening experience more relaxing, creating a “room” around each instrument and vocal.

Soundstage

Soundstage describes the spaciousness of sound. Is the music coming from one tiny point in the middle of your head, or can you hear the space the music is played in? We play selected tracks that sound wide and spacious on good headphones. We close our eyes and try to determine how far away each individual sound is coming from. Do you hear it inside your head, right in front of you, or far away like in a live concert? The bigger the soundstage, the better.

Imaging

Imaging describes the directional accuracy of a given sound. In other words, how well can you pinpoint the direction the sound is coming from? It’s similar to soundstage but slightly different. We play a bunch of tracks that involve sounds coming from different directions. We want to know how easy it is to determine the sound’s direction and how many blurry points there are between the left, right, and center positions.

Making Measurements Using Specialized Tools

To measure a headphone’s response, you can’t use a smartphone or a microphone you have at home. It takes a bit more than that. We use the MiniDSP H.E.A.R.S., a special measuring device with artificial soft ears that simulate a human head, with a microphone in each ear.

MiniDSP H.E.A.R.S. artificial ears with built-in microphones.

The artificial silicone ears help create a perfect seal and take into account how the sound behaves when bouncing around the pinna and inside the ear canal.
We place headphones or earbuds on the artificial ears of the MiniDSP H.E.A.R.S., calibrate the loudness, and play a sine sweep from 20 Hz to 20,000 Hz in each ear multiple times. The MiniDSP is hooked to a computer, where an app called REW records the frequency response. In between measurements, we slightly change the headphones’ position. By doing that, we take into account different wearing styles and how they change the frequency response.

Since the MiniDSP “head” isn’t perfectly anatomically correct, it comes with factory calibrations to correct a frequency measurement. While the MiniDSP H.E.A.R.S. doesn’t produce industry-standard results, like those from comparative lab tests, it’s pretty close and gives a great idea of how headphones sound in general. Plus, you don’t have perfectly calibrated ears (you don’t, trust me), so it’s impossible to hear the difference. Here’s more on how to understand sound comparisons.

The software we use is the popular Room EQ Wizard, or REW for short. REW is free software for acoustic measurements with many different functionalities, including measuring speakers and other audio equipment.

Multiple measurements ensure more accurate data and show differences between wearing styles. When we take 5 different measurements, we average the responses. The averaged graph then gets 1/12-octave smoothing, which erases sharp peaks caused by resonance inside the artificial ears.

How to read a frequency response graph?

Before playing a frequency range sweep (aka sine sweep), we calibrate the loudness at 300 Hz (to 94 dB for earbuds and 84 dB for headphones) and use that as a baseline. If you draw a straight line through the 300 Hz point from 20 Hz to 20 kHz, you get a perfectly flat frequency response. Anything above or below is a deviation from what we call a neutral response. Low frequencies (bass) are on the left side of the graph, the mid-range is in the middle, and high frequencies are on the right.
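The averaging, smoothing, and 300 Hz baseline steps above can be sketched in a few lines of Python. The five "runs" here are synthetic placeholders; a real pipeline would load the measurement exports instead:

```python
import numpy as np

# Five synthetic measurement runs (dB) on a log-spaced 20 Hz .. 20 kHz grid,
# standing in for repeated sweeps with slightly different headphone placement.
rng = np.random.default_rng(0)
freqs = np.logspace(np.log10(20), np.log10(20000), 480)
runs = [5 * np.sin(np.log10(freqs)) + rng.normal(0, 1, freqs.size) for _ in range(5)]

avg = np.mean(runs, axis=0)  # average the five placements

def octave_smooth(freqs, db, fraction=12):
    """Replace each point with the mean of points within +/- 1/(2*fraction) octave."""
    half = 2 ** (1 / (2 * fraction))
    return np.array([db[(freqs >= f / half) & (freqs <= f * half)].mean() for f in freqs])

smooth = octave_smooth(freqs, avg)            # 1/12-octave smoothing
smooth -= np.interp(300, freqs, smooth)       # shift so 300 Hz sits at 0 dB
```

After the final step, a perfectly neutral headphone would trace a flat line at 0 dB, and any value above or below reads directly as a deviation from neutral.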
We use a regular flat line to represent the ideal frequency response. Other sites use the Harman curve or the diffuse-field curve as their target response, which leads to confusion when comparing graphs.

What if headphones aren’t flat? Slight deviations can make a sound signature more pleasant for the end user, since a neutral signature can sound quite dull.

Three earbuds, three different frequency responses.

However, when deviations go too far, they can mask parts of the response. For example, if the bass is too strong, its volume will mask a large portion of the midrange. That can result in a muddy or boomy sound. Especially annoying are peaks around 8-9 kHz, since those cause sibilance, making listening to your favorite music unbearable.

Why care about the frequency response graph? Graphs are a great reference when you want to tweak an EQ on your headphones, especially if you’re bothered by their sound and don’t know which range is the problem. They also show in advance what to expect from headphones before purchasing. You can compare a graph to those of other headphones you like and see the differences (though graphs from different sources often vary).

Making Audio Comparisons

Since frequency graphs can be hard to understand, we make audio comparisons between the original track and the headphones we’re testing. We do that by recording the headphones on our H.E.A.R.S. while playing a specific song.

How do we pick the original track? Finding a proper song that can fully demonstrate a headphone’s capabilities isn’t easy. It has to be packed with different nuances and instruments and use clean, undistorted vocals.

What does the audio comparison tell you? When comparing the original song to our headphone recording, you can better understand which parts of the song are altered by the headphones. Consequently, you can better evaluate whether those headphones suit you.

Why do we have to manually correct the headphone recording?
Before you can listen to our comparison, we have to correct the recording’s frequency response. That’s because the raw audio isn’t calibrated like it is in the REW software.

Testing true wireless earbuds with MiniDSP H.E.A.R.S.

Remember that frequency calibration from before? It tells the software how to correct a recording and by how many dB it must boost or reduce a specific frequency. However, Audacity doesn’t read calibration files, so we can’t simply apply it to our recording, and the raw recording sounds completely off. That’s why we manually correct the recorded song to sound exactly how we hear it from the headphones. We use neutral studio-mixing headphones to tweak the frequency response as closely as we can. That way, you can hear a real difference in the final comparison.

Headphones tend to colorize sound. Therefore, for the best representation of our recordings, we use neutral headphones with minimal colorization.

Testing Battery Life

Real battery duration often differs from what manufacturers claim; commonly, it’s less than what’s promised. That’s why we test battery capacity in a common real-life use case. It’s a simple process: we charge the headphones to 100%, write down the starting time, play various songs at 50% volume, and wait for the battery to drain.

Why do we test at 50% volume? The reason we stick to 50% volume is that it’s safe for long-term listening and close to what most people listen at. Hopefully, you aren’t blasting your hearing into oblivion at max volume. In that case, expect a shorter battery runtime than in our test results. If headphones have extra features like active noise cancellation, we enable those features before starting a test.

What affects battery life?
Battery duration can vary due to:

How many features you enable (more features drain power faster, especially ANC and ambient mode)
Temperature (cold temperatures can lower the battery capacity)
Volume (higher volume drains more power, which is why we keep it constant)

At high volumes, drivers need more energy to push more air. Low temperatures cause your headphones to drain power faster, so we make sure to test them at a normal room temperature of around 68°F (20°C).

How to Read the Battery Comparison Graphs?

Battery comparison graph for ANC earbuds.

Every block represents the battery playtime of a specific headphone model. We group together headphones that share specific similarities; in the example above, all of them are ANC earbuds. We try to be as precise as possible when testing the battery, right down to the minute. Since electronic components are susceptible to the various factors mentioned above, there might be some differences between our results and battery tests on other sites. There’s also the possibility of a defective unit: we usually buy headphones like real consumers instead of getting specific models sent to us by brands.

Testing Comfort & Fit

We (the editors) put the headphones on and wear them for a couple of hours, often hours in a row, so we get to see how they behave the longer you wear them.

What Categories Do We Look For?

To figure out comfort and fit, we ask these questions:

How do they feel the first time you put them on? In earbuds, we look for ergonomics: do they fit on the first try, do they poke our ears, and do they cause pressure inside the ear canal? Some in-ear headphones are too bulky to fit comfortably (though the Sennheiser Momentum True Wireless 2 are actually very comfy). Headphones need to have soft earpads with enough space for your ears so they don’t squeeze them. We even examine the manual for instructions on how to properly wear individual headphones (if such instructions exist).
True wireless earbuds are especially tricky when it comes to wearing style and achieving a good fit.

How are fit and stability at first and after some time? When we put them in, we shake our heads and do a couple of jumps. Later on, we take them on a long walk or run to see how they cope with outdoor activities. Even a facial expression like a smile can move your ears up and break the fit of earbuds.

How quickly do they start causing wearing fatigue? Some headphones start hurting right away (bulky earbuds), while others may start hurting only after longer listening. That’s crucial info if you plan to use them for a couple of hours straight.

Do they feel heavy, lightweight, or something in between? Holding them doesn’t tell you much. Lightweight headphones are usually the most comfortable. However, heavy ones can still be comfy if the weight is evenly distributed. We want to see how comfortable headphones are in different positions.

Do they make ears sweaty over time? (over-ear & on-ear headphones) When listening to music, we observe how quickly our skin gets warm and sweaty. Bigger, over-ear headphones, especially the ones with leather earpads, are the hottest, and thus the sweatiest.

Testing Active Noise Cancellation

Apart from describing the effectiveness of ANC, we also make a video showing its performance. That way, you have a better understanding of what to expect from the tested headphones. Graphs of ANC effectiveness are hard to read and translate into real-life expectations; we believe a more practical example is better. That’s where the ANC test videos come in.

How Do We Test ANC?

In short, we get 3 results:

ANC chart
Noise reduction at a specific frequency
Average noise cancellation value

Plus, to match human ear sensitivity, we convert SPL dB measurements into A-weighted dBA results. More on this below. Most users use the noise-cancelling feature for daily commuting and traveling.
In our ANC headphone test, we use airplane cabin noise as the closest representation of a real-world situation. During airplane flights, cabin noise reaches between 81 and 85 dB. We measure approximately 85 dB at the exact location where our MiniDSP H.E.A.R.S. stands, and we check the loudness at that spot before every ANC test. The loudness can slightly fluctuate, which is normal; noises in the real world aren’t constant, either.

The process

When the loudness is right, we place the headphones on our measuring rig. We use Audacity to record the audio and a camera to record video. First, we measure the SPL in dB: we turn off the ANC to see how much ambient noise passes through the headphones. Then, we enable ANC to see how much ambient noise the headphones successfully reduce, in dB. If there are different ANC modes, we cycle through all of them.

Here’s a comparison of the Sony WF-1000XM5 ANC chart with alternatives.

From the measurements, we get dB values of noise reduction across the frequency range, from low (100 Hz) to high (15 kHz) frequencies. We present them in different formats, like ANC charts and noise reduction values at specific frequencies in dB, from which we calculate the average noise reduction in dBA (A-weighted values). For each frequency measurement in dB, we apply the corresponding A-weighting value according to IEC 61672. After converting to A-weighted dBA values, we calculate the average. The average gives us a simple number that’s easy to compare between models and reflects how actual people experience noise cancellation.

Let us explain this further. The original Sound Pressure Level (SPL) measurements of noise reduction are in decibels (dB). This tells us how much background noise the headphones cancel but doesn’t take human hearing into account. The human ear doesn’t hear all frequencies equally.
In other words, the SPL dB measurements can be misleading and don’t represent the experience one would have in the real world when using the headphones.

Human hearing and frequency sensitivity

Human hearing is most sensitive to sounds in the range of human speech, from 1 kHz to 4 kHz, and less sensitive to very low (bass) and very high (high-pitched) sounds. This means that headphones might measure a high reduction of low- and high-pitched sounds (100 Hz and 10 kHz) in dB SPL, but this would make little difference in the noise reduction perceived by a real human.

Why do we convert dB to dBA?

To more closely match human hearing, we convert SPL measurements from dB to A-weighted values in dBA. The conversion applies a filter that “weights” the frequencies by how the human ear hears them. In effect, it attenuates the frequencies the ear is less sensitive to and emphasizes the frequencies the ear is more sensitive to, giving a better representation of how people perceive sound. In the end, we get data that more accurately represents the effectiveness of active noise cancelling according to the experience of an average human listener. This helps our readers understand the headphones’ performance in real-world situations.

Testing Noise Isolation

We test noise isolation in 2 ways. First, we measure the passive noise isolation with our measuring rig. Second, we take a more hands-on approach and wear the headphones in loud environments to see how they perform in the real world.

Measuring the passive noise isolation follows the same process as measuring ANC, but without turning the technology on: we establish a constant 85 dB noise and then compare it with the measurements when we put the headphones on the artificial ears. This tells us how much noise the headphones block passively in dB (SPL). Then, we apply the A-weighting values according to the IEC 61672 standard to convert the dB data into dBA.
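The dB-to-dBA step described above can be sketched as follows. The A-weighting curve comes from the standard IEC 61672 formula; the per-frequency noise reduction values below are made-up placeholders, not real measurements:

```python
import numpy as np

def a_weight_db(f):
    """IEC 61672 A-weighting gain (dB) at frequency f (Hz)."""
    f2 = np.asarray(f, dtype=float) ** 2
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * np.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20 * np.log10(ra) + 2.00  # +2.00 dB so the curve is ~0 dB at 1 kHz

# Hypothetical per-frequency noise reduction measured in dB SPL.
freqs = np.array([100, 250, 1000, 4000, 10000])
reduction_db = np.array([25.0, 22.0, 18.0, 20.0, 15.0])

# Weight each band by ear sensitivity, then average into a single dBA figure.
reduction_dba = reduction_db + a_weight_db(freqs)
avg_dba = reduction_dba.mean()
print(f"Average noise reduction: {avg_dba:.1f} dBA")
```

Notice how the strong 100 Hz reduction contributes far less to the dBA average than the same reduction at 1 kHz would, which is exactly the "perceived by a real human" effect described above.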
In the end, we calculate the average of the dBA values, which gives us a good representation of how much perceivable noise the headphones block.

Why do we take headphones into the real world to test isolation? Most people take their headphones outdoors for commuting, travel, or sports activities. Out there, you’re exposed to different sounds that vary in loudness and frequency. Also, when you’re moving or talking, the headphones’ fit can change, which affects their ability to block ambient noise. Since you can’t fully recreate all those variables with a measuring rig, we prefer to test them in the real world. A real person (a reviewer) takes the headphones into a loud environment with traffic, trains, people, and other noises, then writes down how the headphones respond to everyday noise pollution.

Testing how headphones combat the city environment.

How do we rate passive noise isolation?

Bad rating: If the environmental sound is too loud and masks what you listen to, passive isolation isn’t sufficient.
Average rating: The headphones offer some passive isolation, but it isn’t better than average.
Good rating: If they effectively block noise from moving cars or construction work, we give them a high rating.

Why is good noise isolation important? Noise can mask your music, making it harder to enjoy. Instinctively, you want to crank up the volume, which can lead to hearing damage. Good isolation also reduces how much sound leaks out, and sound leakage can be bothersome for people around you. Better isolation additionally leads to better bass response, especially in the sub-bass region.

Testing Wireless Connection

Most modern headphones have Bluetooth, and one of the most annoying things is an unreliable, spotty wireless connection. We test the strength of Bluetooth headphones in everyday situations, so you know exactly what to expect. In this category, we try to mimic real situations.
The main questions we ask are:

How quickly do walls break the Bluetooth connection? Walls are a problem for Bluetooth. Most headphones work fine without obstacles, but once you put walls in between, the signal starts breaking up. And we don’t just take the manufacturers’ word for it: with Bluetooth 5.0, many headphones don’t deliver the promised 130 ft of indoor range, and some even have problems reaching 30 ft. That’s why we test it and tell you how it is. We place a smartphone (using the latest Bluetooth version) on a table and start walking away, past walls. When the audio becomes unlistenable (constant stutters), we stop and write down whether that happened at the:

1st wall (bad Bluetooth)
2nd wall (typical Bluetooth)
3rd wall (superior Bluetooth, rare)

For all headphones, we visit the same rooms in the same order, since we want consistency.

What other issues do we look for? Sometimes we notice random audio stutters when reaching for a phone in our pocket. If that happens, we keep a close eye on the issue and even try to recreate it.

What about the Bluetooth range? Bluetooth 5.0 has a wide broadcast range (800 feet or 240 meters), which makes testing open-air range unnecessary.

Do headphones support multipoint? We connect the headphones to a smartphone and a laptop at the same time. If they connect successfully, we watch a video on the laptop and call ourselves on the phone. If we can answer the call without problems, the headphones indeed support multipoint.

What Bluetooth audio codecs do headphones support? Apart from inspecting the packaging and the official spec sheet, we also head into “Developer Options” (Android) to see the supported Bluetooth codecs:

SBC and AAC are common
aptX, aptX Low Latency, aptX Adaptive, LDAC, and other proprietary codecs are a bonus

Inside Developer options on Android, you can see how many codecs headphones support and select which one to use.

How bad is the audio lag when watching videos or playing games? Using both Android and iOS devices, we first play YouTube videos.
By focusing on people’s lips, you can tell how severe the audio delay is. Mobile games are an even better way to test audio latency; playing fast-paced games like Fortnite gives the best results. If headphones support features like a Low Latency mode, we test them both with and without it to see the difference. With the standard SBC codec, found in all Bluetooth-enabled devices, apps like YouTube compensate for latency by delaying the video if necessary to keep it in sync with the audio. Mobile games aren’t as well optimized, so to experience the real latency, we test them as well.

How much audio lag is detectable by the human eye? The ITU (International Telecommunication Union) conducted controlled tests and determined that the threshold of detectability is between -125 ms (audio lagging the video) and +45 ms (audio leading it). For films, the threshold is 22 ms in both directions.

Testing Durability

Build quality is determined by the general feel of the construction, the materials, the IP rating, and a close inspection of weak points. We also examine the included accessories and charging cases. Good workout headphones have to survive at least slight splashes of water.

Do we perform drop tests? Since we only buy (or get) one review unit, we don’t expose it to extremely harsh conditions, meaning we don’t purposely hammer headphones or throw them on the floor. Drop tests are also unreliable, since one earbud can break much faster than the other of the same model. Cheap plastic on headphones is still cheap plastic, and it won’t last long if they accidentally fall. Therefore, there’s no need for violent drop tests.

Are headphones truly water-resistant/waterproof? The official IP rating tells us how headphones handle water exposure, and we test those claims. Headphones that claim a rating up to IPX6 we spray with water (for IPX4 & IPX5) or wash under tap water (IPX6). Water sprayed all over the headphones simulates sweat and rain. When they dry up, we turn them on (or place them back into the charging case) to see if they still work. If headphones have an IPX7 rating or higher, we submerge them in water.
Headphones with IPX7 or higher are submerged in water for a minimum of 10 minutes. We fill a large bowl with water and drop them in to simulate an accidental drop, leaving them underwater for at least 10 minutes. Afterward, we repeat the same steps: let the headphones dry and turn them back on to see if they still work. While seams are usually well protected against water ingress, nozzles are not; they rely only on the mesh having holes tiny enough that water doesn’t pass through.

Testing Features

We dive deep into companion apps and use extra features extensively to determine how good they are. After hundreds of hours of testing and using different apps, we have a pretty good idea of what makes the best apps and which ones need a redesign.

What are all the features we look at?

Active noise cancelling

We rate the ANC quality based on how much ambient noise the headphones reduce. We also compare them to other noise cancelling headphones to see which are better. More details in the section above.

Ambient mode

We focus on how naturally headphones pick up ambient sounds, and whether it feels like you aren’t wearing headphones at all.

Equalizer (and personalized EQ feature)

First, we shuffle between EQ presets to see if there are any good ones. If the app offers a custom EQ, we tweak it based on our frequency response measurement. Sometimes equalization makes headphones sound quieter and less dynamic, so we keep an eye on how well they behave afterward. Some apps have an option to personalize the EQ based on your hearing capabilities: they play a bunch of different sounds, and you select at what point you hear them.

Controls

Using only the onboard commands, we determine how much you can do without reaching for a smartphone. The more you can leave the phone in your pocket, the better. Controls should also be intuitive and easy to master. Auditory feedback when pressing commands is a welcome touch.

Controls customization

We seek complete customization.
All commands have to be individually re-mappable.

The “Find my earbuds” feature

The feature can work in 2 ways. The first one involves playing a loud noise through the earbuds so you can locate them by sound. We hide them on a bed or sofa and ask a person next to us to find them by the sound they emit. The second one is GPS tracking, where the app shows you the last known location of your headphones.

Auto play/pause feature

While playing music, we take the headphones off and listen for whether they pause the playback. When we put them back on, they should continue playing. If headphones have this feature, we want the ability to turn it off.

Along the charging pins, there’s a black dot: a sensor that pauses music playback when you take the buds out of your ears.

Virtual surround sound

Comparing it to the normal mode, we listen for how wide the soundstage is and how well we can pinpoint the sound’s direction.

Manufacturer-specific features (Speak-to-Chat, audio processing, 3D audio)

Following the instructions in the app, we try out the different extra features to see how they benefit the user experience.

How good is the microphone in a noisy and in a quiet environment?

To test microphone quality, we use a mobile app to record audio from our headphones while performing a phone call. We conduct the test in a quiet room to see how well the headphones pick up our voice in ideal conditions. Later, we blast traffic noise to simulate making a phone call near a busy road. Here we want to know how well the headphones reduce background noise and keep your voice understandable.

Conclusion

As you can see, we try to cover every aspect of the headphones we test. Every piece matters, and only if headphones deliver on all fronts can they get a high final score. We want to be as critical as we can when it comes to testing. If we see a flaw, we make sure to point it out, no matter how major or minuscule it is.
While there are no ultimate headphones that everyone will like, we want you to get the best headphones deal possible. Do you have any suggestions on what else we should test? Let us know through our contact page. We’ll appreciate your feedback.

Peter Susic

From a childhood fascination with sound, Peter’s passion has evolved into a relentless pursuit of the finest headphones. He’s an audio expert with over 5 years of experience testing both audiophile and consumer-grade headphones. Quote: “After many years, I can confidently tell which headphones are good and which are terrible.” Find his honest opinion in his reviews.