39 Comments
invinciblegod - Thursday, February 9, 2017 - link
I wish the gaming monitors also had the MST output port.
madwolfa - Thursday, February 9, 2017 - link
Curved? Pass.
Black Obsidian - Friday, February 10, 2017 - link
A 30-second comparison at your local Microcenter (or other purveyor of high-end monitors) will showcase the ignorance of the above comment.
A flat 34" ultrawide at standard viewing distance will appear to be curved *away* from the viewer (because the edges are noticeably farther away than the center), making text and fine graphic details at the edges blurry. Curved screens solve this problem by keeping the edges at the same viewing distance as the middle.
A curve that is silly and useless on a 55" TV mounted twenty feet away becomes essential at much closer seating distances (as with this monitor) or with screens tens of meters across (IMAX-grade theaters).
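For anyone who wants to sanity-check the geometry rather than argue about it, here is a quick back-of-the-envelope sketch. The 21:9 aspect ratio and the 60 cm viewing distance are assumptions for illustration, not measurements of this NEC; the sketch estimates how much farther away the edge of a flat 34" ultrawide sits than its center, and how far off-axis that edge is.

```python
import math

# Assumed figures for illustration (21:9 aspect, 60 cm eye-to-screen distance):
# how much farther away is the EDGE of a flat 34" ultrawide than its center?
diagonal_in = 34.0
aspect_w, aspect_h = 21.0, 9.0
view_dist_cm = 60.0

# Screen width from the diagonal and aspect ratio, converted to cm
width_cm = diagonal_in * 2.54 * aspect_w / math.hypot(aspect_w, aspect_h)
half_width_cm = width_cm / 2.0

edge_dist_cm = math.hypot(view_dist_cm, half_width_cm)   # straight-line distance to an edge
off_axis_deg = math.degrees(math.atan2(half_width_cm, view_dist_cm))

print(f"screen width       : {width_cm:.1f} cm")
print(f"distance to center : {view_dist_cm:.1f} cm")
print(f"distance to edge   : {edge_dist_cm:.1f} cm "
      f"(+{edge_dist_cm - view_dist_cm:.1f} cm, "
      f"{(edge_dist_cm / view_dist_cm - 1.0) * 100:.0f}% farther)")
print(f"edge off-axis angle: {off_axis_deg:.1f} degrees")
```

With those assumptions the edges sit roughly 20% farther from the eye than the center and about 33 degrees off-axis, which is also the geometry behind the IPS brightness-shift comments further down.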
niva - Friday, February 10, 2017 - link
Right on, the curve on TVs is silly, but on a monitor it actually works great. I also encourage people to go to Best Buy or Fry's (or any electronics store) where they can compare a curved vs. non-curved computer monitor for themselves.
HollyDOL - Friday, February 10, 2017 - link
While true, it only works for a single person sitting on the screen's axis. Then there are other problems - TV cameras take flat shots, not curved ones, so watching a movie will create image distortions since it's projected onto a curved surface.
The same goes for graphics-card 3D rendering. GPUs render onto a flat rectangle, and curving the screen edges introduces distortion between what you see and what you would see if, say, the screen were a kind of portal into the rendered scene. Only recently has Nvidia managed to provide rendering for multiple screens, each at its respective angle to the others.
But other than that, if you use an ultrawide for office work, it will certainly be better.
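As a rough illustration of the multi-screen rendering HollyDOL mentions - a sketch of the general technique, not Nvidia's actual implementation or API - the usual fix is to render one view per monitor, with each view's camera yawed by that screen's physical angle, so every flat panel acts as its own correctly angled window into the scene. All physical numbers below are assumptions for illustration.

```python
import numpy as np

def yaw_matrix(deg: float) -> np.ndarray:
    """4x4 rotation about the vertical (Y) axis."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[  c, 0.0,   s, 0.0],
                     [0.0, 1.0, 0.0, 0.0],
                     [ -s, 0.0,   c, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def perspective(h_fov_deg: float, aspect: float, near: float, far: float) -> np.ndarray:
    """Standard symmetric perspective projection (OpenGL-style clip space)."""
    half_w = np.tan(np.radians(h_fov_deg) / 2.0)
    half_h = half_w / aspect
    return np.array([[1.0 / half_w, 0.0,          0.0,                          0.0],
                     [0.0,          1.0 / half_h, 0.0,                          0.0],
                     [0.0,          0.0,          -(far + near) / (far - near), -2.0 * far * near / (far - near)],
                     [0.0,          0.0,          -1.0,                         0.0]])

# Assumed setup: each screen 60 cm wide, eye 65 cm from each screen's center,
# side screens toed in by 30 degrees.
screen_w, eye_dist, side_yaw = 0.60, 0.65, 30.0
h_fov = np.degrees(2.0 * np.arctan((screen_w / 2.0) / eye_dist))
proj = perspective(h_fov, aspect=16.0 / 9.0, near=0.1, far=100.0)

for name, yaw in [("left", -side_yaw), ("center", 0.0), ("right", +side_yaw)]:
    view = yaw_matrix(-yaw)   # world -> camera: inverse of the camera's world-space yaw
    view_proj = proj @ view   # the matrix you would hand to this screen's render pass
    print(f"{name:6s} screen: camera yaw {yaw:+5.1f} deg, per-screen horizontal FOV {h_fov:.1f} deg")
```

A single very wide frustum stretched across all three screens would instead smear the image at the outer edges, which is exactly the distortion described above.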
xthetenth - Monday, February 13, 2017 - link
The angle of the screen to you is far more important to the perception of distortion than the relatively small displacement of the edges. On an IPS ultrawide the corners are far enough off-axis to get brightness shifts which are highly visible and rather distracting.
Beaver M. - Saturday, February 11, 2017 - link
Not to mention that color issues at the edges, which occur with all panel technologies, are fixed by curved surfaces.
xthetenth - Monday, February 13, 2017 - link
This applies to IPS too; the color is okay, but the brightness goes first.
xthetenth - Monday, February 13, 2017 - link
I sold my flat ultrawide and have two curved ones; the curve is essential. At that size, the corners of the display are at a sharp enough angle from the direction you're looking that they're noticeably dimmer.
NZLion - Friday, February 10, 2017 - link
Is MultiSync just branding? I was really hoping to see someone come out with a monitor that supports both FreeSync and G-Sync.
K_Space - Friday, February 10, 2017 - link
I'm not sure what the point of that would be. G-Sync monitors tend to be more expensive than FreeSync ones due to the G-Sync module involved, so those on an AMD GPU will always go for the cheaper FreeSync-only version!
Gich - Friday, February 10, 2017 - link
The point is being GPU-agnostic: a monitor that can work with both AMD and Nvidia.
K_Space - Sunday, February 12, 2017 - link
There is one of those: it's called FreeSync; except Nvidia won't support it with their GPUs. I'd argue it's asking too much of display companies to foot the bill for supporting multiple standards on a product bought by a tiny niche (likely those already looking for a G-Sync monitor), especially when a cheaper G-Sync-only monitor is probably already available.
Regarding the comments below about frugal gamers keeping monitors for 10 years: I'd suggest the very fact that they are price-conscious would rule out a more expensive dual-standard monitor over a FreeSync-only or G-Sync-only one.
Murloc - Friday, February 10, 2017 - link
The point is that for the typical gamer a monitor lasts 10+ years, while a GPU lasts no more than 5.
So you see the conundrum: if you buy now, you buy Nvidia and G-Sync, but who knows if AMD is better in 5 years? And then you're stuck with the monitor.
close - Friday, February 10, 2017 - link
You mean the typical gamer is sporting a 19" LCD from 2006 or older? If so you don't have to worry about changing the GPU, since popular resolutions back then were 1280x1024 or 1600x900.
Better buy a 3D TV, because who knows if all movies are 3D in the future. The point is that unless you have a crystal ball, predicting tech 5 years from now and trying to pick something "compatible" will prove a fruitless exercise.
pattycake0147 - Friday, February 10, 2017 - link
I still use a 22" from 2006. So... yes, 10 years is reasonable for a monitor in my experience.
dstarr3 - Friday, February 10, 2017 - link
I only recently had to get rid of an old gaming monitor from 2004 because it finally wouldn't turn on anymore. Still looked brilliant. Still had a 1ms response time. So yes, you can reasonably expect people to be using decade-old monitors. Because good monitors back then are still good today. It just costs less to get that same quality now.
bigboxes - Friday, February 10, 2017 - link
My wife is still using my NEC from 2006. It's 1680 x 1050. Is that good enough rez for you?
Old_Fogie_Late_Bloomer - Friday, February 10, 2017 - link
I have always hated that resolution lol... so arbitrary and pointless. At that point, why you wouldn't just go 1920x1080, I don't know. I like 16:10 over 16:9 too, but not when the former is worse than the latter in every way.
wackyanimation - Friday, February 10, 2017 - link
3D TV is not a great example here. With FreeSync and G-Sync, we have two competing standards that perform more or less the same function. As a consumer it sucks being locked into just one standard. Yes, I know FreeSync is an open standard, but I highly doubt Nvidia will ever support it. Down the road when I upgrade my GPU next, I don't want to be forced to stick to one brand to continue getting the extra features of my monitor.
Personally, I have three 1080p monitors that I have used since 2009, while in the same period I have changed GPUs three times, once switching from AMD to Nvidia. I have been contemplating moving to ultrawide, and having both G-Sync and FreeSync would be an excellent selling point to me.
mr_tawan - Friday, February 10, 2017 - link
mr_tawan - Friday, February 10, 2017 - link
It was their branding. I think this existed even before G-Sync and FreeSync.
And since G-Sync requires a special module, it's probably up to Nvidia whether it will support FreeSync.
mr_tawan - Friday, February 10, 2017 - link
Correction: it *is* (not was).
DanNeely - Friday, February 10, 2017 - link
Yup, MultiSync branding long predates Free/G-Sync. I've got a pair of circa-2007 NEC MultiSync 2090 (1200x1600) monitors flanking my 30" main screen (a MultiSync 3090).
I wouldn't be surprised if the branding dates back far enough into the CRT era that being able to sync (refresh) at multiple rates (e.g. 60, 75, 90 Hz) was a new feature.
seerak - Friday, February 10, 2017 - link
Prepare to be unsurprised: you're exactly right. When PC displays began to move past the constraints of the old NTSC scan rates (since the earliest monitors were just repurposed TVs) in search of higher resolutions, the market became a mess of different standards. You had CGA, EGA, VGA and various third-party cards with different scan rates - does anybody remember 1024x768 "interlaced" vs. "non-interlaced"?
For the most part you had to buy a monitor to match your graphics card. Upgrading one meant upgrading the other. NEC saw an opportunity in building "multi-sync" monitors that could adapt to the different scan rates. The rest of the industry eventually followed suit.
I used to have a Tatung CRT equivalent that allowed me to run all the weird modes on my Amiga. Fun times.
JohnMD1022 - Thursday, November 29, 2018 - link
That is correct. As I recall, NEC was the first to have this feature.
JohnMD1022 - Thursday, November 29, 2018 - link
Indeed, it is branding.
My first NEC monitor was a MultiSync purchased in 1986.
Topweasel - Monday, February 13, 2017 - link
Just branding, and a roughly 20-year-old one at that.
JohnMD1022 - Thursday, November 29, 2018 - link
As I posted above, my first NEC monitor, a MultiSync, was purchased 32+ years ago, in 1986.
Samus - Friday, February 10, 2017 - link
NEC always kills it in the styling department. Those are downright good-looking monitors. Too bad you have to pay $1000 for something that "lacks" style. LG had a few professional-looking ultrawide displays, but they have all been updated with models that have crappy-looking bases or just ugly designs.
Gothmoth - Friday, February 10, 2017 - link
LG also does not come close to NEC's quality. I have two NEC PA272s and they are 95% as good as my Eizo CG monitors, which cost 80% more.
Gothmoth - Friday, February 10, 2017 - link
Curved died for TVs... reborn for computers. I pass.
bigboxes - Friday, February 10, 2017 - link
Yup. It's a gimmick for sales. Like 3D. I realize there are some that like the feature. If I were a gamer then yeah, I'd be all over the curve. As a content producer I value accurate geometry. I already have an ultrawide monitor. I use a VESA mount to hang the display near me without taking up all the desk space underneath it.
dstarr3 - Friday, February 10, 2017 - link
It's gimmicky on TVs, because you're much too far away from them for it to make a difference. But being just a few feet away from a computer monitor, the difference is much more pronounced. Look at one in person sometime. The difference is a lot more significant than you'd expect, and it's precisely because you sit so close to a monitor vs. a TV.
seerak - Friday, February 10, 2017 - link
I'm in visual effects, and I find that the curve isn't an issue for content creation - though admittedly, VFX is more about looks than precision.
Rather, the advantage for me is that the curve mitigates the corner distortion that comes from being closer to the screen. In doing so, it makes it practical to sit much closer to the screen, giving me a much more usefully expansive workspace. I run a 40" curved TV as a monitor and sit back no more than 3 feet from the screen, and in 4K mode I can run Houdini full-screen and use all of it, including the corners. Before that I was an early adopter of a cheap Seiki 39-inch, and at 3 feet away the corners tended to go unused, or I had to translate my head around more than I liked. A 30-inch 2560x1600 flat would have been just as good.
So yes, curves are useless for a TV where the angle of view is less than 20 degrees - but for monitors, in applications that demand a lot of screen real estate, they're great - similar to angled multi-screen setups.
Beaver M. - Saturday, February 11, 2017 - link
Of course it died for TVs, since nobody sits just 1 meter from them; it's more like 3 meters on average. At such distances a curved surface would only benefit huge TVs, 120" and up, probably more - and I don't really have any idea how big that would be...
For PCs it's awesome. It gives benefits in many respects, like better color at the edges, immersion, and space.
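To put numbers on the viewing-angle argument in the last few comments, here is a quick sketch comparing the horizontal angle a screen fills at the eye for a living-room TV versus a desk monitor. Distances and sizes are typical assumed values, not measurements of any particular setup.

```python
import math

def horizontal_viewing_angle_deg(diagonal_in, aspect_w, aspect_h, distance_cm):
    """Horizontal angle (in degrees) that a flat screen subtends at the viewer's eye."""
    width_cm = diagonal_in * 2.54 * aspect_w / math.hypot(aspect_w, aspect_h)
    return math.degrees(2.0 * math.atan((width_cm / 2.0) / distance_cm))

# Assumed, typical numbers for each scenario
setups = [
    ('55" 16:9 TV at 3 m',         55.0, 16, 9, 300.0),
    ('120" 16:9 TV at 3 m',       120.0, 16, 9, 300.0),
    ('34" 21:9 monitor at 60 cm',  34.0, 21, 9,  60.0),
]

for name, diag, aw, ah, dist in setups:
    print(f"{name:28s} {horizontal_viewing_angle_deg(diag, aw, ah, dist):5.1f} degrees")
```

With these assumptions the 55" TV at 3 m fills only about 23 degrees of view, the 120" TV about 48 degrees, and the 34" ultrawide at desk distance about 67 degrees - which is why the curve matters at the desk and barely registers on the couch.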
vicbee - Friday, February 10, 2017 - link
I will say this every time I see a monitor promoted/reviewed: the new (finally!) trend in TVs is panel + innards connected by a near-invisible cable. That trend makes even more sense for monitors, since most users have them connected to a base which could easily hold the electronics, with the cable hidden in the arm. I give it 2 years tops before all higher-end monitors are near bezel-free, 3-4 mm thick, and curved.
Beaver M. - Saturday, February 11, 2017 - link
No. Many people actually use a monitor arm to have more space on the desk and to be able to pivot the monitor easily, or bring it closer to their eyes or push it further away for different applications. For example, I like to get the monitor very close for games, and further away for working.
With a construction like that, that would be impossible. A VESA mount is a must.
TEAMSWITCHER - Friday, February 10, 2017 - link
TEAMSWITCHER - Friday, February 10, 2017 - link
34" for over $1000. You can a 40" Samsung 4K TV with speakers, wifi, apps, and a remote for only $450. The Monitor industry is dead to me..BigDragon - Friday, February 10, 2017 - link
I'm disappointed with the color gamut and refresh rate on this model. For $999+ I'm expecting 10-bit color with at least 75 Hz refresh. I am glad to see another 34-inch option available, though! This format seems to be gaining a lot of steam. Still trying to convince my employer to replace my dual monitors with one of these!
Why buy this NEC EX341R instead of the Samsung CF791?