Highly Recommended Award
Our Highly Recommended designation is earned by products offering extraordinary value or performance in their price class.
- Laser light source with up to 30,000 hours of maintenance-free life
- Low input lag
- 120Hz refresh rate
- High ANSI lumen output
- Red out-of-the-box is too orange
- Clunky remote design
One of Optoma’s first laser gaming projectors, the GT1090HDR is an improvement over the already good GT1080HDR. The initial cost is higher, but in the long run it's the better choice, pairing many of the same features with a maintenance-free, long-life solid-state light source.

One of the great things about technology marching forward is that once out-of-reach tech makes it into products for the masses. Unlike trickle-down economics, trickle-down technology actually happens. And that's what we get with the $1,399 GT1090HDR—one of Optoma's first laser light source projectors aimed at gamers. The light source has a minimum life of 20,000 hours in Bright mode. That's five times the lamp life of the similarly featured GT1080HDR at full power. And while the GT1080HDR is almost half the price, when you figure in the cost of five $150 replacement lamps, it's basically a wash. Add in higher light output while keeping 4K HDR capability, low input lag, and (with a firmware upgrade this past summer) 120Hz refresh rate at 1080p, and the GT1090HDR looks really good on paper for anyone in the market for a gaming projector.
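For anyone who wants to check that math, here is a minimal sketch of the cost comparison using the figures cited above; actual lamp life, lamp pricing, and street prices vary, so treat it as a rough estimate rather than a definitive calculation.

```python
# Rough total-cost comparison over the laser's 20,000-hour rated life,
# using the review's figures. Lamp life and prices are approximations.
GT1090HDR_PRICE = 1399            # laser model, no lamps to replace
GT1080HDR_PRICE = 799             # lamp-based model
REPLACEMENT_LAMP_PRICE = 150
REPLACEMENT_LAMPS = 5             # the review's estimate over 20,000 hours

lamp_model_total = GT1080HDR_PRICE + REPLACEMENT_LAMPS * REPLACEMENT_LAMP_PRICE
print(f"GT1080HDR plus lamps: ${lamp_model_total}")   # $1,549
print(f"GT1090HDR (laser):    ${GT1090HDR_PRICE}")    # $1,399
```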
Features
The biggest change from the GT1080HDR is the light source. Instead of a lamp, the GT1090HDR uses Optoma's DuraCore laser technology. DuraCore is the name Optoma has used for its laser light engines since 2017, originally in its professional line of projectors. It signifies a long-life light source (at least 20,000 hours at full power and up to 30,000 hours in Eco) and a sealed optical engine carrying at least an IPX5 ingress rating, which keeps dust out of the optics. For end users, that means minimal maintenance and no need to swap out a lamp every few thousand hours.
The laser produces a published light output of 4,200 ANSI lumens (400 more than the GT1080HDR), which is more than enough for use in a room that isn't fully light controlled. The blue laser passes through a phosphor wheel and a 4-segment RGBY color wheel, then bounces off a single Texas Instruments 0.65-inch 1080p DMD chip. With Optoma's Extreme Black technology (which allows the laser to turn completely off with a full black image), rated contrast ratio is 300,000:1. The GT1090HDR supports HDR10 and HLG high dynamic range, albeit with the Rec.709 color space used for high definition content.

A benefit of the laser light source over a lamp is that the projector turns on quickly—11 seconds from button press to the Optoma splash screen and under 30 seconds until it synced with a 4K source (an Xbox One X in this case). It also turns off in seconds and can be turned on again just a few seconds later, unlike a lamp that needs minutes to cool before turning on again. Efficient cooling keeps fan noise to a quiet 32dB (although fan noise skyrockets in high altitude mode).
The GT1090HDR's short throw lens has a 0.5:1 throw ratio, needing less than four feet to throw a 100-inch diagonal image. For proper placement, the lens needs to sit a few inches below the bottom of the screen. The lens has a focus lever with a nice, firm feel. There is no optical zoom, but there is a 0.8-2x digital zoom, digital horizontal and vertical image shift, and auto keystone correction of +/- 30 degrees. As always, unless your setup makes it impossible, I strongly suggest getting your projector placement correct and avoiding digital adjustments, as they can add artifacts and reduce brightness. For more placement and distance information, you can visit ProjectorCentral's Optoma GT1090HDR Throw Calculator.
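If you'd rather estimate placement before heading to the calculator, the arithmetic is simple. Here is a small sketch that assumes a 16:9 image and uses the 0.5:1 throw ratio cited above (throw ratio is throw distance divided by image width).

```python
import math

def throw_distance_inches(diagonal_in, throw_ratio=0.5, aspect=(16, 9)):
    """Approximate lens-to-screen distance for a given image diagonal.

    Throw ratio = distance / image width, so distance = ratio * width,
    with width derived from the diagonal and the aspect ratio.
    """
    w, h = aspect
    width = diagonal_in * w / math.hypot(w, h)
    return throw_ratio * width

distance = throw_distance_inches(100)                       # 100-inch diagonal
print(f"{distance:.1f} in (about {distance / 12:.1f} ft)")  # ~43.6 in, ~3.6 ft
```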
Gamers will appreciate the low input lag on the GT1090HDR, measured with a 1080p/60 signal at 16.4ms with Enhanced Game Mode enabled (49.8ms without). That is comparable to most midrange TVs and fast enough that the vast majority of players won't feel any lag. Enhanced Game Mode can be turned on in any picture mode, not just the one labeled Game. After release, a firmware upgrade (C13) added 120Hz support at 1080p.
On the back, there are two HDMI (one 2.0b/MHL 2.2 for 4K input and one 1.4a), VGA in and out, two 3.5mm audio in and one out, a 3.5mm mic in (for use in conference room/business situations), composite video in, a USB power-only port for a streaming stick (5V, 1.5A), RJ-45 Ethernet, RS-232 control, Micro USB for service, and 3D Sync if you prefer to use an additional RF transmitter and glasses for 3D instead of DLP-Link.
The backlit remote is very busy, with many buttons of similar sizes, and I found myself constantly having to look down at it. The F3 button is poorly positioned between the directional controls and the Menu button; after navigating the menus, my thumb never traveled far enough to reach Menu and close them, landing on F3 instead. The backlight is also very bright on first press (it slowly fades after a few seconds) and took my eyes a moment to adjust each time I looked at it in a dark room. The remote can also act as a laser pointer, an interesting novelty for a home theater projector that Optoma thinks could find use among people working from home.
Key Features List
- 1920x1080 (1080p) native resolution
- Accepts and displays 4K signals at 1080p with HDR
- HDR10 and HLG support
- Maintenance-free DuraCore Laser Light Source
- Up to 30,000 hours in Eco (20,000 in 100% Power mode)
- 4,200 ANSI lumens
- Enhanced Gaming Mode with 16.4ms input lag (at 1080p/60)
- 0.50:1 Short Throw Ratio

Performance
Display Modes and Calibration. The Optoma GT1090HDR has eight different display modes for SDR—Presentation, Bright, Cinema, Game, sRGB, DICOM SIM., User, and 3D (which only enables when the projector senses a 3D signal)—and one each for HDR10 and HLG that engage with an appropriate signal. There are four HDR picture mode settings—Bright, Standard, Film, and Detail. On Optoma projectors from last year, I found these options altered the visible detail in the light and dark extremes of the image and allowed me to adjust them based on the content I was watching. On the GT1090HDR (and a couple of other Optoma projectors I've seen recently), the variation between modes was far less apparent than in the past.

The Bright display mode had the highest light output, measured at 3,662 ANSI lumens. As with most projectors in their brightest setting, there was a green tint to the image in this mode, although it wasn't nearly as objectionable as I usually see and would be acceptable for times you need the extra light output and aren't worried about color accuracy. Cinema, a far more accurate display mode, dropped the light output to 2,110 ANSI lumens, which is still a significant amount of light for a room with some ambient light. In a dark room it was even a bit too bright for me, so I would switch to Eco mode.
I used Calman calibration software from Portrait Displays, a Photo Research PR-650 spectroradiometer, a VideoForge Classic pattern generator for 1080p SDR signals, and 4K HDR patterns from Diversified Video Solutions. In Cinema, the Warm color temperature setting added a red tint to the grayscale (more so toward the darker end of the curve), but I found it far preferable to the very blue look of Standard (especially as the curve approached 100% white), not to mention the Cool and Cold settings. Out-of-the-box color accuracy was mostly decent, with blue being a bit oversaturated. But red had a visible orange tint, which also affected skin tones. A calibration fixed almost every issue. After calibration, average grayscale DeltaE was 2.4 and average color DeltaE was 2.3. The color average would have been lower (green, cyan, magenta, and yellow were all 1.2 or below), but there were still issues with red. It was now undersaturated, though without the orange tint, which lent a much more natural look to flesh tones. HDR, still a bit of an Achilles heel for all projectors, had undersaturated color in everything but blue. The EOTF curve also showed that the luminance of grays in the midtones (40-70 percent) was too low, though this is common for many projectors.
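Calman reports those DeltaE numbers for you, but for readers curious about what the metric represents, here is a minimal sketch of the original CIE76 formula: a straight-line distance between the measured color and its target in CIELAB space. Modern tools typically use the more elaborate CIEDE2000 weighting, and the sample values below are hypothetical, not my measurements.

```python
import math

def delta_e_76(measured_lab, target_lab):
    """CIE76 color difference: Euclidean distance in CIELAB space.

    A DeltaE near 1 is roughly the threshold of a visible difference,
    and values under about 3 are generally considered good for video.
    """
    dL, da, db = (m - t for m, t in zip(measured_lab, target_lab))
    return math.sqrt(dL ** 2 + da ** 2 + db ** 2)

# Hypothetical example: a measured 80% gray patch vs. its D65 target.
print(round(delta_e_76((83.1, 1.2, -2.0), (82.0, 0.0, 0.0)), 2))  # ~2.58
```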
BrilliantColor, a product of TI's DLP technology, defaults to 10 on a 10-point slider. Higher values add white brightness, giving the image an overall brighter look. Decreasing the value usually improves color accuracy and relative color brightness (at a setting of 10, color brightness was only 38% of white), and while it did both of those things here, it also raised the color temperature and turned whites and grays more blue, almost purple at a setting of 1. I found 8 to be a good compromise, improving color brightness without the grayscale suffering.
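For context, a color-brightness percentage like that is typically derived as the sketch below shows: sum the full-field red, green, and blue readings and compare the total against the white reading. The lumen values here are placeholders chosen only to illustrate the calculation, not measurements from this review.

```python
# Color light output as a percentage of white, following the usual approach
# of summing full-field red, green, and blue readings and dividing by white.
# These readings are hypothetical placeholders, not measured data.
white_lumens = 3662
red_lumens, green_lumens, blue_lumens = 520, 690, 180

color_lumens = red_lumens + green_lumens + blue_lumens
print(f"Color brightness: {100 * color_lumens / white_lumens:.0f}% of white")  # 38%
```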
1080p/SDR Viewing. The majority of my 1080p SDR viewing these days is on HBO, since it has yet to keep up with the times and bring its streaming bandwidth and app support up to 4K HDR. My recent obsession has been The Vow, a documentary series about the multi-level marketing company NXIVM and its leader, Keith Raniere, who was convicted last year on sex trafficking and conspiracy charges in relation to a secret society within the company and was just sentenced to 120 years in prison. The detail across the entire image was fantastic. On some short throw projectors I've found that the extreme top corners of the screen can be visibly out of focus, but that wasn't a concern with the GT1090HDR. Cracks in leather couches, stubble on men's faces, and the wood grain on the outside of a house were all presented with excellent detail. Skin tones were a little muted but didn't look unnatural.
Although the Hulu stream of the television show 12 Monkeys is naturally a bit softer than The Vow on HBO, it was still an engaging watch on the GT1090HDR. The green countryside outside of Prague, where much of the final season was filmed, looked beautiful. And the brightness of the projector stood up to the ambient daylight in my living room, even during darker castle interior scenes. There's an important thematic element of a red forest in 12 Monkeys. The dramatic action still gives it weight, but with the undersaturated reds on the Optoma it's missing the visual pop I'm used to, though I likely wouldn't miss it if I hadn't seen it vividly displayed on other projectors.

HDR Viewing. The measurements might suggest a somewhat lackluster HDR performance from the GT1090HDR, but I actually found the experience enjoyable and engaging. The darker scenes of Blade Runner 2049 are usually lost on projectors in this price range, but, probably thanks to the projector's overall light output, there was a nice sense of depth to the image. Bright highlights were missing a bit of the pop I crave in HDR content, though, and the colors were a little muted.
As mentioned earlier, the four different dynamic range settings are all pretty close to each other—more so than in past years. There are some small changes, primarily in brighter highlights. Using the Bright setting can add some extra punch, but it could potentially blow out bright whites. I found Standard (the default) worked best to my eye for all the material I viewed.
Gaming. Controller responsiveness is very quick with Enhanced Gaming turned on. I had no issues whatsoever battling my way through Dathomir in Star Wars Jedi: Fallen Order, or fighting enemies in Mortal Kombat, or swinging my sword at skeletons in Sea of Thieves. Playing with 120Hz refresh rate on the Xbox One X was smooth and I didn't experience any screen tearing. As with my regular TV watching, the brightness of the GT1090HDR was able to hold up to the ambient light in my living room, making it possible to play pretty much everything without issue. Darker moments caused me to lean forward more into the game to be certain of what I was seeing in the shadows, but I never felt strained.

3D Viewing. When the GT1090HDR senses a 3D signal, a pop-up appears asking if you'd like to turn on 3D mode. From that point forward, the Optoma handled 3D content with no issues. Ant-Man had some really wonderful depth, and there was no evidence of ghosting or crosstalk. The projector's brightness is also a significant boon for 3D content. I usually need to worry about drawing the curtains closed or only watching 3D at night, but the presentation was bright and punchy even with the curtains open. The addition of a 3D Sync connector on the back is a nice bonus for anyone who would prefer to use an RF emitter over DLP-Link glasses.
Conclusion
A couple of months ago I got a close look at the GT1080HDR. It's an excellent gaming projector, especially at $799, but it doesn't perform quite as well for movies because of an elevated black level. The GT1090HDR is better than the GT1080HDR. Is it $600 better? Well, when you factor in the cost of the replacement lamps the 1080 will need, it's absolutely a better buy. To get the most out of it you'll want a calibration, but with 20,000 hours ahead of you, it's worth the investment.
The elephant in the room is the imminent release of next-gen consoles and their ability to output 4K at 120Hz into displays with an HDMI 2.1 port. Bleeding-edge enthusiasts will be quick to denounce the Optoma due to its HDMI 2.0 port and 1080p limit. The truth is, at launch there will only be a handful of games that will support 4K/120 (remember, in addition to the console and display, the games need to support it as well). For the majority of gamers embracing the PS5 and Xbox Series X, 1080p at 120Hz will be more than enough, and fit far better into their budget.
For a short-throw gaming projector under $1,500, the Optoma GT1090HDR is going to be hard to beat. It's perfect for a room with some uncontrollable light pollution, and still performs well when the lights go down and the movies start. It's at the top of the list as a home theater gaming projector.
Measurements
Brightness. In the Bright display mode with Brightness Mode at Constant Power 100% and BrilliantColor set to 10, the Optoma GT1090HDR measured 3,662 ANSI lumens—about 87% of the published 4,200 ANSI lumens spec. When switched to Dynamic Black, light output dropped by only 8% to 3,369 ANSI lumens, and only 16% in Eco mode to 3,065 ANSI lumens. Color brightness measured 38% of white.
The remaining display modes measured as follows:
Optoma GT1090HDR ANSI Lumens
Mode | Constant Power 100% | Dynamic Black | Eco |
---|---|---|---|
Bright | 3,662 | 3,369 | 3,065 |
Presentation | 2,767 | 2,546 | 2,324 |
Cinema | 2,110 | 1,941 | 1,772 |
Game | 2,132 | 1,961 | 1,791 |
sRGB | 1,303 | 1,199 | 1,095 |
DICOM SIM. | 2,701 | 2,485 | 2,269 |
HDR | 2,200 | 2,024 | 1,848 |
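If you'd like to check how consistent those brightness-mode reductions are across every picture mode, a quick sketch like the one below runs the arithmetic against the table above.

```python
# Percentage drop from Constant Power 100% for Dynamic Black and Eco,
# using the ANSI lumen values from the table above.
modes = {
    "Bright":       (3662, 3369, 3065),
    "Presentation": (2767, 2546, 2324),
    "Cinema":       (2110, 1941, 1772),
    "Game":         (2132, 1961, 1791),
    "sRGB":         (1303, 1199, 1095),
    "DICOM SIM.":   (2701, 2485, 2269),
    "HDR":          (2200, 2024, 1848),
}

for name, (full, dynamic_black, eco) in modes.items():
    db_drop = 100 * (1 - dynamic_black / full)
    eco_drop = 100 * (1 - eco / full)
    print(f"{name:<13} Dynamic Black -{db_drop:.0f}%  Eco -{eco_drop:.0f}%")
# Every mode lands at roughly an 8% and a 16% reduction.
```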
Brightness Uniformity. Brightness uniformity measured at 77%. With a full white field, the right side of the screen was a bit dimmer, but any uniformity issues were imperceptible with real-world content.
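For reference, a uniformity figure like this typically comes from measuring a full-white field at a grid of points and dividing the dimmest reading by the brightest. The sketch below shows that calculation with hypothetical readings, not the raw data from this review.

```python
# ANSI-style brightness uniformity: lowest reading over highest reading,
# measured across a grid of points on a full-white field.
# These nine readings are hypothetical, for illustration only.
readings_lux = [
    410, 455, 430,
    470, 500, 460,
    420, 465, 385,
]

uniformity = 100 * min(readings_lux) / max(readings_lux)
print(f"Brightness uniformity: {uniformity:.0f}%")  # 77% with these values
```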
Fan Noise. The listed noise level for the Optoma GT1090HDR in lab conditions is 32dB. In casual measurements in my room I recorded 34dB sitting about four feet behind the projector and was completely unbothered by the sound. As the image gets brighter and darker on screen, there's no change to the fan noise. Turning on high altitude mode boosts the fan noise significantly to 53dB. At that level, it's a bit difficult to hear quiet dialogue moments.
Input Lag. Using a Leo Bodnar 1080p lag tester, I measured 1080p/60 input lag on the Optoma GT1090HDR at 16.4ms with Enhanced Gaming turned on. At that speed, the vast majority of gamers will be unable to feel any lag between button press and action on screen. Enhanced Gaming can be turned on in any display mode. With it off, input lag increased to 49.8ms, which is enough that casual gamers could easily feel the delay and would make playing a game like Mortal Kombat or Overwatch difficult and frustrating.
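To put those lag numbers into frame terms (simple arithmetic on the measurements above, not additional testing): a frame at 60Hz lasts about 16.7ms, so 16.4ms is under one frame of delay, while 49.8ms works out to roughly three frames.

```python
# Convert measured input lag into frames of delay at a given refresh rate.
def lag_in_frames(lag_ms, refresh_hz=60):
    frame_time_ms = 1000 / refresh_hz
    return lag_ms / frame_time_ms

for lag in (16.4, 49.8):
    print(f"{lag} ms is about {lag_in_frames(lag):.1f} frames at 60 Hz")
# 16.4 ms -> ~1.0 frame; 49.8 ms -> ~3.0 frames
```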
Connections

- HDMI 2.0b/MHL 2.2 with HDCP 2.2
- HDMI 1.4a
- VGA in/out
- 3.5mm audio in (x2)
- 3.5mm audio out
- 3.5mm mic in
- Composite video in
- USB power (5V, 1.5A)
- RJ-45
- RS-232C
- Micro USB (for firmware updates)
- 3D Sync
Calibrated Settings
Calibrated image settings from any third party do not account for the significant potential for sample-to-sample variation, nor for the different screen sizes and materials, lighting, lamp usage, or other environmental factors that can affect image quality. Projectors should always be calibrated in the user's own space and tuned for the expected viewing conditions. However, the settings provided here may be a helpful starting point for some. Always record your current settings before making adjustments so you can return to them as desired. Refer to the Performance section for some context for each calibration.
SDR
Display Mode: Cinema
Brightness: 1
Contrast: 0
Sharpness: 9
Color: 7
Tint: 0
Gamma: Film
Color Settings:
BrilliantColor: 8
Color Temperature: Warm
Color Matching:
Red
H-21, S21, G27
Green
H-9, S-1, G35
Blue
H-21, S7, G-5
Cyan
H-33, S7, G39
Yellow
H-30, S-3, G5
Magenta
H33, S15, G40
White
Red Gain 8, Green Gain -7, Blue Gain 2
RGB Gain/Bias
Red Gain: -8
Green Gain: 0
Blue Gain: -7
Red Bias: -2
Green Bias: 0
Blue Bias: 0
Brightness Mode: Constant Power 100%
HDR
Display Mode: HDR
Brightness: 0
Contrast: -7
Sharpness: 8
Color: 17
Tint: 0
Color Settings:
BrilliantColor: 10
Color Temperature: Warm
Color Matching:
Red
H0, S0, G0
Green
H0, S0, G0
Blue
H-2, S0, G0
Cyan
H-31, S0, G0
Yellow
H-35, S-5, G0
Magenta
H32, S0, G0
White
Red Gain -2, Green Gain 0, Blue Gain 0
RGB Gain/Bias
Red Gain: -3
Green Gain: 0
Blue Gain: -5
Red Bias: -1
Green Bias: 0
Blue Bias: 0
Brightness Mode: Constant Power 100%
For more detailed specifications and connections, check out our Optoma GT1090HDR projector page.
I didn't need to change the BenQ colors, but for the GT1090 the red was too strong (I didn't find the red orange like the reviewer did, which is strange; maybe a different unit). I reduced the red to -6 and red bias to -10 (Game mode) and -2/-6 (Cinema mode). I also increased brightness to +2. I prefer the standard colors instead of the warm color.
IMO the GT1090 is far superior to the GT1080 thanks to the amazingly fast, brighter, long-life laser light source and the consequently more efficient and quieter fan solution. It is worth the higher price.
Overall I am very happy with my purchase. I would have preferred a 4K option; hopefully next year they will release the same model with improved red color and 4K.
Regarding HDMI 2.1 -- I would hope any new models announced for release late next year will have the updated HDMI port, but that's not based on anything but conjecture. The issue is chip supply and who gets them first, as well as possibly some economics involved. You could expect that a TV maker like LG would have early access because of their flatpanel business and the comparatively low number of chips that would have to be diverted to their projector runs, though Sony's newest models use HDMI 2.0 (I'm a little unclear on what's found in the new Samsung USTs). LG's projector is expected either late this year or early next. But keep in mind that any projector with HDMI 2.1 that's intended for gaming must also enjoy low-enough input lag for that purpose -- it makes no difference if a projector can handle 120 Hz at 4K from a game console if the lag is a drag, so to speak. LG hasn't made any claims to date as to the suitability of its projector for gaming.
https://www.projectorcentral.com/projectors-compare.cfm?pid_1=11180&pid_2=11179
How do you think this projector will pair with the Vividstorm Ceiling Slimline tab-tensioned ALR screen?
What are the chances of the GT1090HDR sneaking into any Black Friday deals? I'm feeling like it's unlikely with it being such a new release, but you never know.
Are there any viable alternatives in the short throw (not UST) category?
Thanks!
You can feed it through the monitor output of an HDMI AV receiver.
When the projector is not being used during the day it's pointing directly at the sun for a good few hours. Should we be using some sort of lens cap or cover? I've got a temporary cover for now.
I'm trying to dial in the calibration settings and the colour option is greyed out. Any ideas why this would be the case?
Overall really impressed, loving the fact it works pretty well even in a bright room, but at night it looks fantastic. Games and movies alike.
I'm not sure which controls you're referring to. If certain calibration controls are grayed out, it may be that they're just not available in the picture mode you're using. Try one of the other picture modes (which may be less bright out of the box) and see if you can brighten the image with the laser output control.
"I haven't had first-hand experience with the UHD30, but I've talked to colleagues that have. It's a bit of a tough call. None of the projectors have a great contrast ratio, and that's due mostly to the fact that their black levels are elevated. So I don't expect you'll see a drastic improvement in that realm from any of the projectors. The main benefit you'll have with the UHD30 is the resolution (although it has some image quality issues with HDR). It's colors out of the box aren't quite as good as the 1090HDR and it has almost double the input lag (although it sits around 30ms, so it's still not that bad). Unless you're craving a 4K image, I think the 1090 (especially at $100 less than the UHD30) is a better deal. And you won't have to worry about buying replacement lamps down the line."
Is this a good choice, or should I be looking at something else?
I might be making an error somewhere else in the process but I can’t be sure.
Thanks for pointing out the typo! I've looked back on my calibration numbers for the projector and there weren't any other typos between the writing and the posting (and I no longer have the sample in-house to retest). From what I remember, the red during calibration was exceptionally finicky and just a change of a few notches on the slider could alter the picture pretty significantly. Pairing that with differences from sample to sample (that can sometimes be dramatic), it could account for the not-so-great skin tones you're seeing with my values. As always, these numbers are at best a guide. If they make the picture worse on your model, by all means go with the default settings.
In addition, I have a distance constraint: I have to mount it on the ceiling at a distance of 1.5m (5 ft) and cover a 130-inch screen diagonal, so I don't have much choice of projector. I am bothered that the projector is only 1080p, and knowing that the future is 4K, should I wait for the new projectors coming out this year? Are you aware of any other projector that will do the same as the 1090HDR but in 4K?
I currently have a room with a flight simulator and I use three Optoma EH415STs, one for each screen. I want to improve further. Do you find the 1090 a good choice, or would I be better off waiting since more will be released in the next year?
Thank you for your valuable advice!
Thank you.
Last week I purchased the GT1080HDR. The lens had a tiny scratch which didn't seem to affect it, there was no lens cap, and the carrying case was also missing. I figured it wasn't that big of a deal, so I emailed customer service to inquire about the missing lens cap and carrying case. After sending the email, I noticed that one of the HDMI ports was faulty, as the device I had hooked up to it kept disconnecting. I decided to return it and figured that while I wait for a refund, I'd read up on what projector to get instead. Despite the poor experience with the 1080, I was heavily leaning toward the upgraded 1090.
All I have to do is drop the 1080 off at the UPS store, and then I can get to researching. As I pull into a parking spot, Optoma's customer service emails me. They said I was wrong about a lens cap and carrying case being included, but that I could buy a lens cap for 4 dollars plus tax and 12 dollars shipping. They didn't link the lens cap, and I definitely couldn't find it on their website. All of this should be moot because at this point I'm in the process of returning the projector, but I decided to email them back a screenshot of their website, the place I got my information.
Optoma's webpage for the 1080 states that it comes with a lens cap and a carrying case; they're listed under the "in the box" section within the specs. There are also two data sheets available on the page for download, one with a 2019 copyright and a newer updated one with a 2021 date. The newer 2021 sheet, which specifies "Optoma USA," also states that a lens cap and case are "in the box." I sent them a screenshot of the specs section, and they replied saying I was wrong and that the website doesn't state what I sent them; they then sent me the 2019 data sheet, which doesn't mention the lens cap or case. I was flummoxed: here I am on their website reading in plain English what's included, not to mention I purchased the device from a link on that same page. I replied by sending them the updated 2021 data sheet, the one that specifies Optoma USA, and they haven't replied. It doesn't matter; the busted 1080 has been sent back, and I'll have to pass on the 1090 if that is their customer service standard.
Funny that projector sites don't even mention the HDR version when reviewing the HDRX. With these old specs, this projector should be less than $1,000 at this point.
New DLP chips on 2022 Optoma models provide true 4K and lower latency than the 1090HDR/HDRX for just a bit more money.
They also did it for the 1080HDR, EH412ST, and others: add an X to the name of a dead projector and raise the price to 2022 rates.
I wish my 2020 car were worth more in 2022 than what I originally paid for it.