With the holidays upon us and money burning a hole in some of our pockets, I thought: what better time than now to drop some knowledge on you? I don’t know about you, but I want to get the absolute most for my money – to me, that in itself defines value.
So, you’re in the market for a new monitor? I’m guessing the first question that comes to mind is “What’s the best monitor?” Well, that all depends on: 1) your intended purpose, 2) what device(s) / hardware will be providing the video source, and 3) your budget.
1) What’s your intended purpose?
- Is this a general-purpose display?
- Is it primarily intended for gaming?
- Is it for work / multitasking?
2) What devices / hardware are supplying signal to the display?
- It makes NO SENSE to buy a 4K display if your hardware can’t effectively push that resolution to the display. If you intend to upgrade in the future, realistically evaluate when that upgrade will happen. Why? Because these displays will only get cheaper over time.
- PC folks, do your research – don’t just take word-of-mouth reports from people regarding their GPU and frames-per-second results. You can easily Google your video card model followed by “gaming benchmarks” and get solid results. For example, a GTX 1070 is likely to give 30 to 45fps at 4K in most games.
- Console folks, I’m sure you’ve heard the magic acronym “HDR.” High dynamic range balances the light levels in games, enhancing colors and bringing displays closer to the full spectrum the human eye can see. If your console / device supports HDR, you may want to ensure the display does as well to get the most out of it.
- Last but certainly not least, don’t skimp on cables. Think of graphics as data – a 4K image has four times as many pixels as 1080p. You have two options here: HDMI or DisplayPort. Console players, I believe you’re limited to HDMI only – and let it be known that not all HDMI cables are equal! HDMI has many varying specifications, ranging from 1.0 to 2.0. To effectively display images at 4K YOU NEED A MINIMUM OF HDMI 1.4, which supports 4K resolution at up to 30Hz. PAY ATTENTION: HDMI 2.0 supports 4K @ 60Hz. If you want to drive 4K at any faster refresh rate, your only option (as of today) is DisplayPort! Current DisplayPort versions support 4K at 120Hz and beyond.
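To make the cable point concrete, here’s a minimal back-of-the-envelope sketch of the raw, uncompressed bandwidth each mode needs. The numbers are illustrative only – they ignore blanking intervals and link-encoding overhead, so real cable requirements are somewhat higher – but the ratios show why an older HDMI cable chokes at 4K:

```python
# Rough uncompressed-bandwidth estimate: why cable spec matters at 4K.
# Illustrative only: ignores blanking intervals and encoding overhead.

def bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate uncompressed video bandwidth in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "1080p @ 60Hz": (1920, 1080, 60),
    "4K @ 30Hz":    (3840, 2160, 30),
    "4K @ 60Hz":    (3840, 2160, 60),
    "4K @ 120Hz":   (3840, 2160, 120),
}

for name, (w, h, hz) in modes.items():
    print(f"{name}: ~{bandwidth_gbps(w, h, hz):.1f} Gbps")

# 4K (3840x2160) has exactly 4x the pixels of 1080p (1920x1080):
print(3840 * 2160 / (1920 * 1080))  # -> 4.0
```

4K @ 60Hz works out to roughly 12 Gbps of raw pixel data, which is why it needs HDMI 2.0 (about 14.4 Gbps of usable data rate) while 4K @ 30Hz (about 6 Gbps) squeaks through HDMI 1.4.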
3) Anytime someone asks me “What’s the best [insert random piece of hardware here]?” my first question is always “What’s your budget?” “The best” can cost upwards of $100,000 if you want to get into audio/videophile territory. Yes, I’ve seen TVs that cost $80k!
- Set a realistic budget that works for you and work within those parameters. If your budget is $200, honestly, you won’t have a very good 4K experience. Buying knock-off no-name brands won’t do you any favors.
- Remember, hardware (especially displays) only gets cheaper over time as technology advances.
- Evaluate ways to compromise – would you rather have a smaller display with better picture quality, or a larger display with worse picture quality? If you’re in a theater-type setup, sitting a good distance from the screen, the larger display may be the better option. If you’re gaming right on top of the screen, maybe the smaller display makes more sense.
Ok, let’s talk more about display types. There are two major, more commonly known display panel technologies.
1) TN (Twisted Nematic) is the most common technology and also the oldest. The main advantage is that it provides the shortest response times, making TN panels good for gaming. In combination with LED back-lighting, TN monitors also offer high brightness and draw less power than competing technologies. Another important factor is that they are cheap to manufacture, resulting in low prices for end users. The main drawback of the technology is the color shift that occurs at wider viewing angles. There are large differences in quality between products, and the lower-end ones will exhibit color shift even at moderate angle changes. A TN-based display can usually be identified by these color distortions when viewing the picture from above or from the sides.
2) IPS (In-Plane Switching) monitors’ main advantages are noticeably better color reproduction and much better viewing angles. The downside used to be difficulty reproducing deep blacks, which in turn meant problems with contrast. IPS panels were also very expensive and slow in the beginning. Now manufacturers have started producing so-called Super-IPS (S-IPS) panels at reasonable prices. Response times have crept down considerably and contrast is much better. In addition, color reproduction and the options to calibrate colors are superior to the other panel types. IPS panels keep colors consistent, even at sharp angles.
In short, TN panels offer the absolute lowest response times and highest refresh rates. IPS panels, however, will likely offer a “richer”-looking picture at the cost of lower refresh rates and higher response times (though we’re literally only talking a few milliseconds here), due to the display “processing” the image to add that extra quality. IPS panels can also be calibrated, which is helpful if you’re a graphic artist / designer – it means the colors you see on screen better represent what you’ll see printed out, especially over the life of the monitor with periodic calibration.
Let’s talk response time, refresh rate, and input lag.
1) Response time is a measure of how quickly a pixel can change from black to white or from one shade of gray to another. Lower response times are better. Typical response times right now are 1ms for TN panels and around 4ms for IPS panels. Response time IS NOT the same as input lag! Again, response time refers to a pixel’s ability to change colors, NOT the display’s time (as a whole) to react to an input, say from a controller or keyboard/mouse. We will go into this further later.
2) Refresh rate of a display is the speed (rate) at which the display’s image changes (refreshes). The faster the refresh rate, the more times the image can update every second and the smoother the image will appear. This number of changes per second is measured in hertz (Hz). The generally accepted level of refresh rate that leads to a satisfying image depends on the application. Cinemas run at just 24Hz, while the old TV standards of PAL and NTSC ran at 50Hz and 60Hz respectively. A typical PC monitor will have a refresh rate of 60Hz, but the latest gaming displays can reach all the way to 240Hz.
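The relationship between refresh rate and response time is simple arithmetic: each refresh gives a pixel a fixed window of time to finish changing. A quick sketch (illustrative numbers only, using the typical 1ms TN / 4ms IPS figures mentioned above):

```python
# Frame time shrinks as refresh rate rises; a pixel must finish its
# transition well within one frame, or ghosting becomes visible.

def frame_time_ms(refresh_hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (24, 60, 144, 240):
    print(f"{hz}Hz -> {frame_time_ms(hz):.2f} ms per frame")

# Compare against typical quoted response times (TN ~1 ms, IPS ~4 ms):
# at 240Hz a frame lasts ~4.17 ms, so a 4 ms IPS transition uses
# nearly the entire frame window.
print(frame_time_ms(240))  # ~4.17 ms
```

This is why a high refresh rate alone isn’t enough – the panel’s response time has to keep up with the shrinking frame window.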
Input lag is the measure of time from when a user input happens (like a controller button press or mouse click) until its associated action is displayed on the screen. Unfortunately, I don’t know of ANY manufacturer that publishes this measurement, but I can tell you a few facts about it. For reference, old-school CRT (analog tube-type) displays have effectively ZERO input lag. There is no post-processing of the signal whatsoever – with an analog display, what goes in comes right out. In fact, this is how you can test input lag: by comparing the same signal sent to an analog display versus a digital one. Sites like https://displaylag.c...splay-database/ use this method to test displays. Digital displays, however, can experience added delay because the image is processed and/or enhanced before it is displayed. This is more common with TVs and less common in gaming monitors. If you’re gaming on a TV, you’ll want to see if your model supports “game mode” – this typically turns off any extra image-enhancing features to decrease input lag. For home theater setups, the number one way to reduce input lag is to feed the source signal to the display first! DO NOT feed a display source into a receiver and then to the display – you will undoubtedly introduce a significant amount of input lag doing that!
Takeaways here: “ghosting” (the appearance of a ghost or secondary image on screen) can still occur on high-refresh-rate displays IF the response time is inadequate. Basically, the ability to reduce ghosting depends on BOTH response time and refresh rate.
What is G-Sync / Freesync?
Both of these technologies are designed for using a monitor with a variable refresh rate rather than a fixed refresh rate. G-Sync is NVIDIA’s solution, while FreeSync is AMD’s.
Traditionally, a PC monitor has had a fixed refresh rate, like 60Hz. The display refreshes its image 60 times per second, no matter what. Your PC’s graphics card just continues pushing frames to the display at whatever speed it can, which can result in screen tearing — part of the display showing one frame while another part of the display is showing another frame. This gets worse if your game’s frame rate varies a lot.
V-Sync has been a traditional solution for this, but it has a lot of problems of its own. V-Sync eliminates tearing and makes the image smoother, but it can introduce delay. Rather than sending a frame that would result in screen tearing, V-Sync holds the next frame for a bit, resulting in a delay.
G-Sync and FreeSync introduce variable refresh rates. If your game is rendering at 40 frames per second, your display will update at 40 frames per second. If it starts rendering at 75 frames per second, your monitor will refresh at 75 frames per second. The monitor and graphics processor talk to each other, and the refresh rate constantly changes to match the images being sent to the display. This eliminates stuttering and screen tearing and reduces input lag, resulting in a much more fluid image when playing PC games, without the problems of V-Sync.
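The V-Sync delay described above can be sketched with a toy model (my own simplification, not how any driver actually works): with V-Sync on a fixed 60Hz display, a finished frame must wait for the next refresh tick, while a variable-refresh display can refresh the moment the frame is ready.

```python
# Toy latency model: V-Sync on a fixed 60Hz display holds a finished
# frame until the next refresh tick; variable refresh (G-Sync/FreeSync)
# displays it immediately. Simplified illustration only.

import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between fixed refreshes

def vsync_delay_ms(frame_ready_ms):
    """Wait from frame completion until the next fixed refresh tick."""
    next_tick = math.ceil(frame_ready_ms / TICK_MS) * TICK_MS
    return next_tick - frame_ready_ms

# Frames finishing at arbitrary times (ms since start):
for ready in (5.0, 20.0, 33.0):
    print(f"frame ready at {ready} ms: V-Sync adds "
          f"{vsync_delay_ms(ready):.2f} ms; variable refresh adds ~0 ms")
```

The added delay varies from nearly nothing to almost a full frame (~16.67 ms at 60Hz) depending on when the frame happens to finish, which is exactly the inconsistency variable refresh eliminates.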
Can you use Freesync with an NVIDIA GPU / G-Sync with an AMD GPU?
Yes, but with caveats. NVIDIA’s G-Sync technology was the first solution. This is a proprietary NVIDIA solution — it requires an NVIDIA graphics processor that supports G-Sync as well as a display that supports G-Sync. Every PC monitor that supports G-Sync includes a proprietary hardware module that talks to the NVIDIA GPU and adjusts the display’s settings on the fly.
AMD’s FreeSync was the second solution. This is AMD’s solution, and it’s not proprietary. Instead, it’s based on a royalty-free industry standard known as DisplayPort Adaptive-Sync. Displays that support FreeSync don’t need a proprietary hardware module, and this makes them a bit cheaper.
Using the two together is actually no problem: a FreeSync monitor will indeed work with an NVIDIA graphics card. You will, however, lose some functionality – you can’t use the adaptive-sync feature that maintains the variable refresh rate. At that point, honestly, the monitor provides nothing beyond what the in-game V-Sync option can. It works, but I’d recommend matching the technology to your GPU for the best experience.
I hope this helps you with any future purchases. If you have questions please post them and I’ll add more content addressing them.
Monitor buying guide: https://www.144hzmonitors.com/ – these guys do a great job breaking down displays into categories (best under X dollars, best refresh rate, etc.)