Nvidia Share

Showing posts sorted by date for the query Nvidia Share.

Huawei's Mate 10 Pro Is Smart Enough To Drive A Porsche



I'm strapped into the shotgun seat of a Porsche Panamera, parked in a lot just outside FC Barcelona's Camp Nou stadium. My chauffeur for the afternoon: Huawei's Mate 10 Pro smartphone.

You read that right. A phone will be driving this car.

No, Huawei isn't getting into the autonomous car business. The Chinese telecom giant set up this experiment to show off the processing prowess of the flagship phone's Kirin 970 chip, which features an artificial intelligence engine.

"This is purely a showcase of what the phone today is capable of," said Arne Herkelmann, European head of handset portfolio and planning for Huawei.

Alongside buzzwords like 5G and augmented reality, AI stands as one of the key themes for the Mobile World Congress trade show here. The mobile industry has taken its cue from the success of digital assistants like Amazon's Alexa and is touting smarter networks and devices.


There's no one in the driver's seat. The phone mounted to the windshield is the only one controlling this car. 

Richard Peterson/CNET

For instance, LG unveiled a revamped flagship called the LG V30S, which added extra memory and AI capabilities. Nokia talked about the role of AI in all the traffic flowing through faster 5G networks. Verizon Chief Technology Officer Hans Vestberg said in an interview that he sees AI -- the power behind computer programs that can learn and adapt on their own -- being useful for detecting and automatically repairing problems with the network.

In November, Huawei unveiled the Mate 10 Pro and the vaunted AI engine in the Kirin 970. The company began selling the phone in the US in February, although without a carrier partner. 

Herkelmann said that since the launch, he'd been inundated with questions about how exactly its AI works. MWC 2018 presented a chance to show off those capabilities. 

Enter the road reader challenge. The company wanted to see if the phone was smart enough to recognize objects like a dog, a soccer ball or a person on a bicycle and tell the car to maneuver away. (Don't worry. The company used cardboard stand-ins.) The engine was fed more than 1 million images and can recognize 1,000 objects.
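The loop Huawei describes, recognize what's in the camera frame and then pick an avoidance maneuver, can be sketched with any off-the-shelf 1,000-class image classifier. Below is a minimal Python illustration using a pretrained MobileNet from torchvision; it is not Huawei's HiAI pipeline, and the model choice and the object-to-action table are assumptions made purely for illustration.

```python
# Minimal classify-then-react sketch (NOT Huawei's actual code): a generic
# pretrained 1,000-class ImageNet model stands in for the Kirin 970's
# on-device recognizer. Model choice and the action table are assumptions.
import torch
from torchvision import models
from PIL import Image

weights = models.MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights).eval()
labels = weights.meta["categories"]  # the 1,000 ImageNet class names
preprocess = weights.transforms()    # resize/crop/normalize for this model

# Hypothetical mapping from recognized object to maneuver. Real ImageNet labels
# are finer-grained (dog breeds, bike types), so this table is illustrative.
ACTIONS = {"soccer ball": "swerve_right", "mountain bike": "swerve_left"}

def decide(frame_path: str) -> str:
    """Classify one camera frame and return the avoidance maneuver."""
    x = preprocess(Image.open(frame_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        label = labels[int(model(x).softmax(dim=1).argmax())]
    return ACTIONS.get(label, "brake")  # default to braking if unrecognized
```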

Autonomous driving, DIY style

Huawei spent five weeks putting this project together -- and you could kind of tell.

There was none of the polish you'd find in a self-driving car from Alphabet's Waymo unit or Uber's autonomous fleet. Missing were any sophisticated radars and depth sensors.

Huawei chose the Panamera because it wasn't already a self-driving car. The company's engineers mounted a high-speed camera on the roof, which provided a constant video feed to the phone of everything in front of the car. They also rigged up simple robots to help control the gas, brake and steering wheel.

A developer called Kerve created an app with a simple user interface, allowing you to tap a button on the phone to get the car going.


The Huawei Mate 10 Pro uses its artificial intelligence engine to detect objects in front of it. 

Richard Peterson/CNET

I had come into this thinking that the car would go along a curvy track. But instead it was set to accelerate down a simple straight path for roughly 100 feet. Considering how slowly we were going, it almost felt like a waste of a good Porsche.

Herkelmann said Huawei could have taught the phone to drive the car around corners or on different roads, but it would have taken more time and space than the company had.

What was it like?

I had a chance to ride through the course twice. The first time was a practice run, in which the car moved at 5 miles per hour. Employees at the other end rushed to the road with cardboard obstacles at random times and locations, and you could see through the app on the Mate 10 Pro that it was able to determine whether the object was a soccer ball or a dog.

Once the car got a few feet away from a dog, it abruptly stopped.


The car swerves out of the way of a man and a bicycle (in cardboard form). 

Richard Peterson/CNET

Before beginning the next run, you choose how you'd like to avoid specific objects (swerve to the right, turn to the left or brake).

Fortunately, the second time offered a little more pop. The Panamera jumped to about 30 mph, and when it got close to a man and a bicycle, swerved to the right.

While the experience was light on thrills, Huawei had made its point. The drive to build autonomous vehicles has chipmakers like Qualcomm and Nvidia offering dedicated processors for the auto industry. Huawei jury-rigged this in a few weeks with an off-the-shelf phone.

That alone is impressive, even if I'll stick to reading emails and posting to Instagram on my phone, and keeping my hands on the steering wheel when I'm in my car. 

Galaxy S9 and S9 Plus: Hands-on with Samsung's iPhone X fighters.

MWC 2018: All of CNET's coverage from the biggest phone show of the year.



Intel Shows Off The Chip Tech That Will Power Your PC In 2025



Intel on Thursday showed a silicon wafer studded with chips built with a manufacturing process that's set to arrive in 2025, a signal intended to reassure customers that the company's years of chip manufacturing difficulties are behind it.

"We remain on or ahead of schedule against the timelines that we laid out," Chief Executive Pat Gelsinger said of the company's plan to improve manufacturing processes. He showed off a gleaming wafer of memory chips built with the company's upcoming Intel 18A process, which overhauls the transistors at the heart of chip circuitry and the way power is delivered to them.

Intel is trying to dramatically accelerate manufacturing progress to meet a 2025 goal of reclaiming the chip performance lead it lost to Taiwan Semiconductor Manufacturing Co. (TSMC) and Samsung. If it succeeds, it'll mean PC chips progress faster after a half decade of lackluster performance improvements. And it could mean Intel becomes more relevant to your digital life by building chips inside your car, phone and gaming PC graphics card.

At the heart of the effort is moving through five new manufacturing processes in four years: Intel 7 in 2021 with the Alder Lake chips now powering PCs, Intel 4 in 2022, Intel 3 in 2023, Intel 20A in early 2024 and Intel 18A in late 2024 -- though the lag between manufacturing availability and product delivery means 18A chips won't arrive until 2025. Showing the wafer is a "proof point" that Intel is on track, Gelsinger said.

Gelsinger, a chip engineer who returned to Intel a year ago, brings tech cred to the CEO job, but it'll be tough for the company to claw its way back. Once a chip manufacturer falls behind the leading edge, as IBM and GlobalFoundries did in recent years, it's harder to justify the colossal investments needed to advance to the new technology.

Embodying Intel's difficulty is Apple's decision to eject Intel Core processors from its Macs in favor of its own M series chips built by TSMC. At the same time, AMD has been gaining market share, Nvidia has been profiting from gaming and AI, and Amazon has introduced its own server processors.

Gelsinger spoke at Intel's investor day, where he and other executives sought to convince often skeptical analysts that the company's enormous spending on new chipmaking equipment will pay off. That will come through premium products and external customers arriving to use its new foundry manufacturing capacity.

Intel 20A introduces two major changes to chip design, RibbonFET and PowerVia, and Intel 18A refines them for better performance. RibbonFET is Intel's take on a transistor technology called gate all around, in which the gate that governs whether a transistor is on or off is wrapped entirely around ribbon-like channels that carry the electrical current.

And PowerVia delivers electrical power to the underside of the transistor, freeing the top surface for more data link circuitry. Intel is playing catch-up with RibbonFET, but it's got a lead with PowerVia, which the industry calls backside power delivery.

Intel is pressing ahead with another lead -- packaging technology that links different "chiplets" into one more powerful processor. The Sapphire Rapids member of Intel's Xeon server family arriving this year employs one packaging variety, called EMIB, while the Meteor Lake PC chip arriving in 2023 employs another, called Foveros.

Intel Moore's Law forecast

Intel expects to keep up with Moore's Law, which calls for a doubling in the number of transistors per processor every two years. That'll happen through smaller transistors and new packaging techniques combining multiple "chiplets" into one processor.
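Moore's Law here is just an exponential with a two-year doubling period, which makes the arithmetic easy to check. A quick sketch (the starting figures below are illustrative, not Intel's actual transistor counts):

```python
# Back-of-the-envelope Moore's law projection: transistor counts double
# roughly every two years. Starting count and years are illustrative only.
def projected_transistors(start_count: float, start_year: int, target_year: int) -> float:
    """Project a transistor count forward assuming a doubling every 2 years."""
    return start_count * 2 ** ((target_year - start_year) / 2)

# Example: a hypothetical 10-billion-transistor chip in 2021 projects to
# about 40 billion transistors by 2025 (two doublings).
print(f"{projected_transistors(10e9, 2021, 2025):.2e}")  # 4.00e+10
```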

Intel

Intel built its first Meteor Lake prototypes in the final quarter of 2021 with the Intel 4 process and booted them up in PCs, said Ann Kelleher, the executive vice president who leads Intel's technology development division.

"This is one of the best lead product startups we have seen in the last four generations of technology," Kelleher said. "Over its lifetime, Meteor Lake will ship hundreds of millions of units, offering the clearest demonstration of leadership packaging technologies in high volume."

Packaging will play a role in future PC processors, including Arrow Lake in 2024, which will incorporate the first chiplets built with Intel 20A. After that comes Lunar Lake, which will use Intel 18A chiplets. Meteor Lake and Arrow Lake will use a new graphics chip architecture that Intel promises will be "a huge step forward," which is important given that graphics chips these days do a lot more than paint pixels on your screen -- for example AI and video image processing.

Kelleher also detailed a host of research and manufacturing changes to prevent the catastrophic problems Intel faced in recent years. For one thing, improvements are now modular, so a problem with one needn't derail others. For another, Intel is developing contingency plans for when problems do arise. And it's paying more attention to the advice of chip equipment suppliers like ASML.



Intel Vs. AMD: Who's Got The Fastest Chip Now?



Advanced Micro Devices' new Trinity chip doesn't deliver the performance trifecta necessary to threaten Intel's market-leading position, according to most initial evaluations.

It's an old story line by now: AMD comes out with a new processor that offers better graphics performance but, overall, does little to change Intel-AMD market dynamics -- which of course heavily favor Intel.

And AMD has done it again. Tapping into the graphics processing unit (GPU) expertise it gained by acquiring ATI in 2006, the Sunnyvale, Calif.-based company continues to ding Intel on GPU performance.

But AMD fails to threaten Intel on central processing unit (CPU) speed and power efficiency.

But don't take my word for it. "AMD's Trinity...doesn't unseat [Intel's] Sandy Bridge from its position of performance supremacy," wrote Tom's Hardware, referring to the Intel chip design announced in January of last year.

Let's insert a quick parenthetical here. Intel is now shipping its next-generation Ivy Bridge chip, and performance will only improve vis-a-vis AMD.

That said, there's plenty of praise for AMD's graphics silicon. Game play is good: AMD's Trinity is recommended "if you're a casual gamer" by Tom's Hardware.

But for higher end games, the advantage isn't necessarily there. "Your best bet continues to be laptops with an Intel CPU and a discrete GPU from Nvidia, at least of the GT 640M level," according to Anandtech.

And note that Intel these days is touting media processing performance for tasks like transcoding: converting a file from one format to another. For example, converting a movie so it is playable on an iPod.

In this area, Intel's Quick Sync is competitive with AMD, said Anandtech.
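For anyone who wants to try that kind of transcoding themselves, here's a minimal sketch that shells out to ffmpeg from Python. The file names are placeholders; ffmpeg must be installed, and the hardware-accelerated h264_qsv encoder only works on systems with Quick Sync support, so software libx264 is the default.

```python
# Minimal transcoding sketch using ffmpeg via subprocess. File names are
# placeholders. h264_qsv (Intel Quick Sync) is only available where the
# hardware and drivers support it; libx264 is the software fallback.
import shutil
import subprocess

def transcode(src: str, dst: str, use_quick_sync: bool = False) -> None:
    """Re-encode a video to H.264 video and AAC audio."""
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg not found on PATH")
    codec = "h264_qsv" if use_quick_sync else "libx264"
    subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", codec, "-c:a", "aac", dst],
                   check=True)

# transcode("movie.mkv", "movie_for_ipod.mp4")  # software encode with libx264
```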

AMD is making strides with battery life, though. "It's worth pointing out that the concerns about AMD's battery life from a few years ago are now clearly put to rest," Anandtech said.

Then there's the school of thought that Intel needs to be afraid, very afraid. "AMD has a very credible chip on their hands with Trinity, and Intel should be very worried," said chip site SemiAccurate.

But one financial firm is not that enthusiastic. "Advanced Micro Devices'...Trinity seems unlikely to gain share, and will likely compete on price rather than performance against Intel's Ivy Bridge," said MKM Partners in a post on Barron's.



Apple's M1 Pro And M1 Max Chips Mean New Trouble For Intel



A year ago, Apple announced it was taking on Intel's most efficient chips by introducing lightweight MacBook laptops powered by the M1, a homegrown processor. On Monday, the consumer electronics giant expanded its challenge, launching MacBook Pro laptops built around the new M1 Pro and M1 Max that take on Intel's beefier chips.

The new MacBook Pros bode well for Apple's attempt to take firmer control over its products. And they're bad news for Intel, whose chips Apple is ejecting from its Macs after a 15-year partnership. It's a loss of revenue, prestige and orders to keep its factories running at full capacity.

"Intel has completely lost the Mac and is unlikely to regain it any time soon," New Street Research analyst Pierre Ferragu said in a research note Tuesday.

Intel didn't lose this big customer overnight. The company that was once synonymous with consumer computers -- remember Intel Inside? -- fell on hard times because of difficulties upgrading its manufacturing. New CEO Pat Gelsinger has started an Intel recovery plan, including an effort to revitalize manufacturing progress. But turning around a behemoth requires patience. 

Meet the Mac's new chips

Intel's troubles encouraged Apple to develop its own chip expertise and technology for computers. (It already designed its own A-series chips for the iPhone and iPad, and indeed the M-series chips capitalize on that investment.) The company's M1 processors, which came in last year's MacBook Air and low-end 13-inch MacBook Pro, were evidence it wanted to take control of its own future.

The M1 Pro and M1 Max demonstrate the company's increasing power as a chip designer. Both are designed for more capable models, the 14-inch and 16-inch Pros, geared for video editors, programmers and others with intense computing needs. The heft of the chips -- each of which sports eight performance and two efficiency cores, compared with the M1's four-by-four design -- is intended to sustain heavy work. They also come with much more powerful graphics and more memory, up to 32GB for the M1 Pro and 64GB for the M1 Max.

Miniaturization is what lets chip manufacturers economically squeeze in more transistors, a chip's electronic circuitry elements. The new M1 models are doozies of miniaturization, with 34 billion transistors in the M1 Pro and 57 billion in the M1 Max. That's how Apple could add special chip modules for graphics, video, AI, communications and security into its high-end MacBook Pros.

Intel's troubles

Intel, which for decades has led the world in chip technology, suffered for the last half decade as an upgrade to its manufacturing technology dragged on longer than the usual two years. The company's problem came as it tried to move from a 14-nanometer manufacturing process to 10nm, the next "node" of progress. (A nanometer is a billionth of a meter.)

Intel didn't respond to a request for comment. Apple didn't comment for this story.

Apple's chip foundry, Taiwan Semiconductor Manufacturing Co., took advantage of Intel's lag to the benefit of Apple, Nvidia, AMD and other Intel rivals. It now leads in electronics miniaturization and the all-important measurement of performance per watt of power consumed. 

The result is the M1 Pro and M1 Max, which according to Apple's measurements are 1.7 times faster than Intel's current eight-core Tiger Lake chips, formally called 11th generation Core. Compared differently, the M1 Pro and Max consume 70% less power than the Tiger Lake chips at the same performance level.
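Those two numbers are the same efficiency story told from two operating points, which a toy performance-per-watt calculation makes clear. The wattage baseline below is an assumed placeholder; only the 1.7x and 70% figures come from Apple's claims.

```python
# Toy performance-per-watt comparison of Apple's two claims. The 45W baseline
# is a made-up placeholder; only the ratios (1.7x perf, 70% less power) matter.
def perf_per_watt(performance: float, watts: float) -> float:
    return performance / watts

baseline = perf_per_watt(1.0, 45.0)                # normalized Intel reference
same_power = perf_per_watt(1.7, 45.0)              # 1.7x the work at equal power
same_perf = perf_per_watt(1.0, 45.0 * (1 - 0.70))  # equal work at 70% less power

print(same_power / baseline)  # 1.7x better efficiency at that operating point
print(same_perf / baseline)   # ~3.3x better efficiency at that operating point
```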

Apple doesn't reveal which speed tests it uses, so the results are hard to validate at this stage. The consensus, however, is that the performance claims are valid in broad terms.

"I am overall impressed at what Apple has been able to do on the latest process from TSMC," said Patrick Moorhead, analyst at Moor Insights and Strategy. He estimates that Apple saves a few hundred dollars per laptop because it doesn't have to buy Intel processors, although it spends a lot of that money designing its chips.

Don't count Intel out yet

To be sure, Intel won't be hurt badly by the loss of Apple's business. The company has plenty of other business. The vast majority of Windows PCs still use x86 processors from Intel and AMD. And customers only rarely change from Windows to MacOS or vice versa.

It also doesn't have a lot of competition. Apple doesn't license its chips to others, and Qualcomm's efforts to sell processors to PC makers have been a limited success at best. 

Intel mostly has to worry about AMD, which makes increasingly capable chips but still trails in market share.

Intel also has its Alder Lake processor, scheduled for later this year, and Meteor Lake processor, coming in 2023, to generate excitement. The chips will bring speed boosts in part by adopting a combination of performance and efficiency cores, just like the M1 does, and by adopting the new Intel 7 and Intel 4 manufacturing processes.

Still, Apple has taken the wind out of Intel's sails. Intel may narrow the gap as its new chips hit the market. But in the meantime, Apple's M series could help it steal market share from Windows computers, Intel's stronghold.



Nvidia's $2,500 Titan RTX Is Its Most Powerful Prosumer GPU Yet



Nvidia's Titan cards have always walked a fine line between the gamer-oriented GeForce and the professionally targeted Quadro. They're basically Quadro-power cards with GeForce-capability drivers. That historically plops them into the really, really expensive gaming GPU category or on the lists of video professionals who demand speed and value more than certification.

The new $2,499 Titan RTX, a Turing-architecture-based card that Nvidia announced Monday, adds even more of that power to the mix. It should still appeal to gamers, especially those who want to play Metro Exodus in 8K when it arrives in 2019. But the architecture's optimized ray-tracing and AI-acceleration cores also make it an option for more dataset-focused research, AI and machine-learning development and real-time 3D professional work that doesn't require workstation-class drivers.

The distinction between the GeForce and Quadro cards is waning over time as applications drift away from OpenGL. Adobe's video applications such as Premiere and After Effects, for example, use the CUDA cores directly for acceleration. But Photoshop is still the elephant in that room. You can still only get 30-bit color support (10 bits per channel) with the workstation drivers, which are restricted to Quadro cards.

It's hard to make direct comparisons solely based on specs, in part because Nvidia is inconsistent about the specs it provides at launch. You usually have to wait a little bit until people dig in and ferret them out. 

Most of the specs Nvidia's provided for the $6,300 Quadro RTX 6000 and the $2,499 Titan RTX are almost identical -- the Quadro does have a faster base GPU clock speed and four DisplayPort connectors vs. the Titan's three. So I can't wait to find out what magic the Quadro performs that merits an almost $4,000 premium. Given that neither GPU is shipping yet (the Quadro's in preorder and the Titan is slated for the end of November), we'll have to wait and see.

On the flipside, the less-endowed Quadro RTX 5000 only costs $200 less than the Titan RTX, so you give up quite a bit of power in exchange for those workstation certifications.

As a gaming card, it looks like it'll fit right into its traditional slot as a power bump up from the highest-end GeForce. But unless it delivers a bigger performance gap than the previous generation's GTX 1080 Ti/Titan Xp did, it won't be worth it at twice the price of the RTX 2080 Ti -- at least not until more games ship that take advantage of its ray-tracing processors.

Comparative specifications


| Spec | GeForce RTX 2080 Ti (Founders Edition) | Quadro RTX 5000 | Quadro RTX 6000 | Titan RTX | Titan Xp |
| GPU | TU102 | TU104 | TU102 | TU102 | GP102 |
| Memory | 11GB GDDR6 | 16GB GDDR6 | 24GB GDDR6 | 24GB GDDR6 | 12GB GDDR5X |
| Memory bandwidth | 616GB/sec | 448GB/sec | 672GB/sec | 672GB/sec | 547.7GB/sec |
| GPU clock speed (MHz, base/boost) | 1,350/1,635 | 1,620/1,815 | 1,440/1,770 | 1,350/1,770 | 1,405/1,582 |
| Memory data rate/interface | n/a / 352-bit | n/a / 256-bit | n/a / 384-bit | 14Gbps / 384-bit | 11.4Gbps / 384-bit |
| Texture fill rate (gigatexels per second) | 420.2 | 348.5 | 509.8 | 510 | 379.7 |
| Ray tracing (gigarays per second) | 10 | 8 | 10 | 11 | n/a |
| RT cores | 68 | 48 | 72 | 72 | n/a |
| RTX-OPS (trillions) | 78 | 62 | 84 | n/a | n/a |
| CUDA cores | 4,352 | 3,072 | 4,608 | 4,608 | 3,840 |
| Tensor cores | 544 | 384 | 576 | 576 | n/a |
| FP32 (TFLOPS, max) | 14 | 11.2 | 16.3 | n/a | 12.1 |
| Price | $1,200 | $2,300 | $6,300 | $2,500 | $1,200 |
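One relationship worth noting: where both figures are listed, the bandwidth column follows directly from the memory data rate and the bus width, since bandwidth in GB/sec is the per-pin data rate in Gbps times the interface width in bits, divided by 8. A quick check:

```python
# Memory bandwidth (GB/s) = data rate (Gbps per pin) * bus width (bits) / 8.
def memory_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbps(14, 384))    # 672.0 -- Titan RTX / Quadro RTX 6000
print(memory_bandwidth_gbps(11.4, 384))  # 547.2 -- close to the Titan Xp's listed 547.7
```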

Correction, 12:55 p.m. PT: An earlier headline on this story had the incorrect price for the Nvidia Titan RTX. It costs $2,500.

Fastest gaming laptops, ranked: All the most-powerful gaming laptops tested in the CNET Labs.  

Computers for the creative class: The very best new laptops, tablets and desktops for creatives. 



Intel's Core I9-11980HK Leads The Way For Gaming And Creative Laptops



If it weren't for the ancillary technologies that come with Intel's latest round of Tiger Lake Core H-series CPUs, today's 11th-gen launch could seem like kind of a snoozefest. Yes, these are the first of the high-power mobile gaming-and-creative-targeted CPUs built on the company's 10-nanometer SuperFin process -- tech that essentially improves current handling to deliver better performance -- led by an always notable flagship i9, the i9-11980HK. Yes, the i9 is faster than its 10th-gen predecessor. And yes, Intel promises that it's faster than AMD's offerings. Frankly, it would be newsworthy if Intel didn't make those claims. But these chips are basically just an expansion of the line Intel rolled out at CES 2021.

As has become habit with mobile processor launches, Nvidia and Intel have been making them in tandem. At the same time as the new Tiger Lake-H series launch, Nvidia revealed its low-end RTX 3050 and 3050 Ti mobile GPUs.   


CES 2021 is where Intel debuted the Tiger Lake-H architecture and related process, which (in conjunction with the 500-series chipset) adds support for Thunderbolt 4, Killer Wi-Fi 6E/Gig+, DDR4-3200 memory, dual built-in displays, Optane H20 and 20 lanes of PCIe Gen 4.  

Because Gen 4 allows a direct connection to the CPU rather than using a separate bus, it brings with it a couple of notable capabilities for power users. One is Resizable BAR, which allows the system to allocate an optimal amount of video memory for the CPU to use for graphics operations not otherwise run on the GPU. That means it takes less time to move the graphics data for rendering out to the display, and can eke out some extra graphics performance. (It's similar to AMD Smart Access Memory, which debuted with the Radeon RX 6000 series desktop cards in October 2020.) It also lets manufacturers incorporate bootable SSD RAID arrays using Intel's Rapid Storage Technology. So speedier storage in larger capacities.   

Specifications

| CPU | Cores/threads | Cache | TDP | Base frequency (GHz) | Max single-core frequency (GHz) | Max all-core frequency (GHz) |
| Core i9-11980HK | 8/16 | 24MB | 65W | 2.6 | 5.0 | 4.5 |
| Core i9-11900H | 8/16 | 24MB | 35W | 2.5 | 4.9 | 4.4 |
| Core i7-11800H | 8/16 | 24MB | 35W | 2.3 | 4.6 | 4.2 |
| Core i5-11400H | 6/12 | 12MB | 35W | 2.7 | 4.5 | 4.1 |
| Core i5-11260H | 6/12 | 12MB | 35W | 2.6 | 4.4 | 4.0 |

Just because the processor and chipset support these capabilities doesn't mean you'll see them in all laptops; some of them, such as implementing PCIe Gen 4, are subject to individual manufacturers' preferences and product-line strategies. The i7 and i9 carry on Intel's incorporation of Turbo Boost 3, notable for its automatic selection of the fastest and most reliable core to boost to the max for single-threaded operations.

There are commonalities across all the CPUs, including integrated Intel UHD Graphics. Intel has stressed that the integrated GPU uses its latest Xe graphics architecture, but, as with its desktop 11th-gen (Rocket Lake-S) CPUs, Intel chose to brand it with the old, old UHD Graphics nomenclature. That's because one of Intel's requirements for a chip to carry the Iris Xe brand is at least 80 execution units, and these H-series chips have only 32 EUs. These CPUs are intended for use in laptops with discrete graphics, so that paucity of EUs can be a minor, if irritating, drawback.

Intel also announced its Tiger Lake-H commercial processors, both Core and Xeon, which use the secure, managed vPro chipset.



LG C2 OLED TV Review: Early Favorite For Best High-End TV



In the last few years LG's "C" series OLED models have risen to the top of my list as the best high-end TV for the money. The C2 is the first 2022 TV I've reviewed, so it's too early to award it that crown, but so far it's the favorite. The C2 offers image quality that's a clear step above any non-OLED TV I've seen, a bigger range of sizes than ever -- including a new 42-inch option -- and a price that's not too steep.

This year, however, the OLED TV competition is tougher than ever. LG's archrival Samsung has an OLED TV too, promising better color with an all-new QD-OLED panel. Sony offers two different kinds of OLED, including a QD-OLED of its own that looks pretty sweet in person. And in 2022 more TV-makers sell mini-LED models, which promise excellent image quality for much less money than OLED.

As is usual in the first half of the year, a new TV's stiffest competition comes from its older self. In my side-by-side comparisons, the C2 and last year's LG C1 OLED TV looked very similar despite the C2's new "Evo" panel, one of the 2022 upgrades LG touts. That's why, if you want a new high-end TV now, you should still get the C1. 

Over the summer the C1 will sell out and the C2 will drop in price, making it more appealing. If you want the best price on a C2 you should hold off until fall, at which point I'll have a much better sense of how the C2 stacks up against its rivals. It's off to a good start though. 

LG C2 sizes, series comparison

I performed a hands-on evaluation of the 65-inch OLED C2, but this review also applies to the other screen sizes in the series. All sizes have identical specs and, according to the manufacturer, should provide very similar picture quality. The exceptions are the 42- and 48-inch sizes, which lack the "Evo" panel and might be slightly dimmer than the others as a result (although the difference is minimal, if my comparisons to the non-Evo C1 are any indication). 

The C2 series sits in the middle of LG's 2022 OLED TV lineup, with the widest range of screen sizes and all the features I expect from a high-end TV. Spending more for the G2 gets you a slightly brighter panel according to LG, as well as the wall-friendly "gallery" design. The less-expensive A2 lacks the HDMI 2.1 gaming features, 120Hz refresh rate and fancier processing found on the other 2022 LG OLEDs. 

David Katzmaier/CNET

Lighter weight, nearly all picture

The C2 is a very nice-looking TV, with a minimalist appearance similar to past LG OLEDs, but the company made some changes for 2022. When a colleague and I set it up, we actually felt the first such change: it's lighter than the C1 by a noticeable amount, up to 47 percent lighter depending on size. The 65-inch version I reviewed weighs just 37 pounds with its stand, compared to 72 pounds for the 65-inch C1. 

New carbon-fiber materials are responsible for the reduced weight, according to LG, and I noticed it on the TV's backside. The edges of the panel are slightly more squared-off as well. I also appreciated the narrower bezel, 6mm slimmer than the C1, leading to even more of an all-picture look, although if I didn't have the two TVs side-by-side I probably wouldn't have noticed. The stand has a much smaller footprint than last year and raises the panel a bit more over the table, both improvements in my book.

David Katzmaier/CNET

LG kept the same remote, unfortunately. In my old age I've grown easily annoyed by too many buttons, and I much prefer the streamlined, simple layout of Samsung and Roku/TCL remotes, for example. As always, you can wave LG's remote around to move the cursor, or scroll quickly through menus with the built-in wheel.

Smart TV, crowded menu

LG's WebOS menu system is not my favorite, in part because of the clutter. You'll see notes and notifications along the top, a box that displays the weather, a prompt to sign in to LG's system, a seemingly random collection of stuff labeled "Trending Now," then (finally) the list of apps below. Signing in unlocks a new 2022 feature, customized recommendations and additional user accounts. LG touts the fact that you can set up favorite sports teams, for example, but most people will just go straight to the app and skip the clutter. As usual, I prefer a simpler interface like Roku, and if you like customizations and options Google TV is a better bet. On a TV this expensive you should just attach a good streaming device instead. 

David Katzmaier/CNET

Also new for 2022 is something LG calls "always ready." Instead of turning the screen off when you press power, the TV displays your choice of art wallpapers, a clock, "sound palette" art or your own custom photos. Designed for people who would rather have something on their big screens rather than a big black rectangle, it's similar to the ambient mode Samsung TVs have offered for the last few years. Personally I'd rather save the power, so I'd leave this feature (and my TV) turned off.

The elements of the always-ready feature and LG's screensaver move around so as not to risk burn-in. Here's where I remind you that, like all OLED TVs, the C2 is more subject to both temporary and permanent image retention, aka burn-in, than LCD TVs. The risk is small, which is why I don't consider burn-in a reason for most people to avoid buying an OLED TV. Check out our guide to OLED burn-in for more.

The new "always ready" feature puts something on the screen even after you turn it "off."

David Katzmaier/CNET

LG also added a new multiview feature that puts two sources side by side or picture-in-picture, but unfortunately it's quite limited. You can't show two HDMI inputs on-screen, and the main thing you can do -- share a screen from your phone side-by-side with an input -- didn't work with Apple AirPlay. Like most TVs, the C2 does support Apple's phone-mirroring feature, and it also lets you issue Google Assistant or Amazon Alexa voice commands by speaking into the remote or, new for 2022, hands-free when you say the wake word like "Alexa."

Well-connected, especially for gamers

LG continues to excel at connection options. All of LG's 2022 OLED models (aside from the A2) include the latest version of the HDMI standard: 2.1. That means their HDMI ports can handle 4K at 120 frames per second and variable refresh rate (including Nvidia G-Sync and AMD FreeSync), as well as enhanced audio return channel and automatic low latency mode (auto game mode). In other words, they can take advantage of the latest graphics features available from PlayStation 5 and Xbox Series X and S consoles as well as high-end graphics cards. The C2 is rare among high-end TVs in that all four of its HDMI ports support 4K/120 -- great for hard-core gamers with multiple next-gen devices. 

  • Four HDMI inputs with HDMI 2.1, HDCP 2.2
  • Three USB 2.0 ports
  • Optical digital audio output
  • RF (antenna) input
  • RS-232 port (minijack, for service only)
  • Ethernet (LAN) port

All four of the C2's HDMI inputs support HDMI 2.1 features.

David Katzmaier/CNET

LG OLED C2 picture quality comparisons

My side-by-side comparisons involved the best TVs I had on-hand, but the only other OLED was the LG C1 from last year. Since it's early in 2022, the C2 was the only current model-year television in the group – I'll compare it to other 2022 TVs as soon as I get the chance. Here's the lineup:

TV and movies: The LG C2 has a spectacular picture but watching it next to the C1 from 2021, any improvements were really tough to see. And measurements backed up my initial impressions: Both TVs delivered essentially equal numbers, and both were extremely accurate in their best modes. Both outperformed the TCL TVs in my comparison overall, as expected.

The comparison lineup with the LG C2, center, on the gray TV stand and the C1 to its right.

David Katzmaier/CNET

I started my comparison with familiar (to me) high dynamic range material, namely the demo montage from the excellent Spears & Munsil HDR benchmark 4K Blu-ray. Both OLEDs showed equally pleasing images. The perfect black levels and lack of blooming (stray illumination) in areas like the honey dripper and cityscapes created superior punch to the LCD-based TCLs. And while the snowscapes, deserts and other full-screen bright scenes from the TCL TVs outshined the OLEDs, smaller highlights in areas like the ferris wheel at night were actually brighter on the LGs. Spot measurements using a light meter revealed the C2 as being slightly brighter than the C1 on the ferris wheel, but with the naked eye I couldn't really see the difference. I also saw more saturated, natural color on the LGs, in particular reds like the strawberries and flowers.

Switching to TV content, I put Severance from Apple TV Plus on all four sets and the results were similar. During Helly's brain surgery in Episode 2 the dark areas looked more true and realistic on the OLEDs, without the blooming -- in the letterbox bars near the operating lights, for example -- I saw on the TCLs. The brightness advantage of the LCDs in the office training scene later was obvious, but the faces of Mark and Helly looked flatter and less defined. Again, however, the C1 and C2 were very difficult to tell apart.

The new overlay for Game Optimizer shows vitals like frames per second and variable refresh rate, at a glance.

David Katzmaier/CNET

Gaming: As with nongaming content, the OLEDs looked better than the LCDs in my side-by-side comparisons, although the two LGs again looked very similar. The C1 was my favorite gaming TV last year, and the C2 improves it just a bit. 

LG's Game Optimizer mode offers myriad adjustments and the updated overlay menu surfaces them in a more logical way, putting VRR next to FPS and offering a few more shortcuts on the bottom, including to the new Dark Room mode. That mode dims the image and is designed to reduce eyestrain, but even though I game in the dark a lot, I don't have much use for it. Playing Horizon Forbidden West in HDR on PS5, for example, Dark Room mode made the moonlit forest less dazzling and the mountain snowscape duller, but if you're someone who's bothered by bright sequences in games it might be useful.

A new Sports mode joins the litany of picture modes, but as I found last year, I liked Standard best for most games with its balance of shadow detail and contrast. FPS is best if you want more visibility into shadows, or you can just crank the Black Stabilizer control up (at the expense of a washed-out image). I appreciate the separate adjustments just for gaming, which most other TV makers don't have.

The full Game Optimizer menu shows even more options.

David Katzmaier/CNET

Buried within Game Optimizer is another setting labeled "Reduce input delay (input lag)" with two options, Standard and Boost. The former, which is the default for any game, serves up an excellent input lag result similar to past LG OLED models: just 13.5ms for both 1080p and 4K HDR sources. Engaging Boost cuts lag even further, to just under 10ms for both. The catch is that Boost is only available for 60Hz sources, so you can't use it with 120Hz games or VRR. And no, I don't think many humans would notice the extra 3ms of lag.
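To put those milliseconds in context, here's a quick conversion of the measured lag into fractions of a frame time; the 9.9ms value below stands in for the "just under 10ms" Boost result.

```python
# Convert measured input lag into fractions of a frame at a given refresh rate.
def frames_of_lag(lag_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000 / refresh_hz
    return lag_ms / frame_time_ms

print(frames_of_lag(13.5, 60))   # ~0.81 of a 16.7 ms frame (Standard)
print(frames_of_lag(9.9, 60))    # ~0.59 of a frame (Boost, just under 10 ms)
print(frames_of_lag(13.5, 120))  # ~1.62 frames at 120 Hz
```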

Bright lighting: Although LG touts the C2 as 20% brighter than non-Evo OLED TVs like the C1, my measurements didn't back that claim up. Yes the C2 was a bit brighter, about seven percent on average, but the difference wasn't visible in just about anything I watched. In my experience those differences are slight enough to vary from sample to sample.

Below are my measurements in nits for select comparison TVs in their brightest and most accurate picture modes, using both standard dynamic range (SDR) and high dynamic range (HDR) test patterns.

Light output in nits

| TV | Brightest mode (SDR) | Accurate mode (SDR) | Brightest mode (HDR) | Accurate mode (HDR) |
| Hisense 65U8G | 1,619 | 1,612 | 2,288 | 2,288 |
| Samsung QN65QN90A | 1,622 | 1,283 | 2,596 | 1,597 |
| TCL 65R635 | 1,114 | 792 | 1,292 | 1,102 |
| Sony XR65X90J | 951 | 815 | 945 | 847 |
| LG OLED65C2 | 413 | 389 | 812 | 759 |
| LG OLED65C1 | 409 | 333 | 790 | 719 |
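As a rough sanity check of the "about seven percent" average brightness gain mentioned above, averaging the four C2-versus-C1 columns in this table lands in the same neighborhood; the reviewer's average presumably covers more measurements than these four.

```python
# Rough check of the C2-vs-C1 brightness gain using the four columns above (nits).
c2 = {"bright_sdr": 413, "accurate_sdr": 389, "bright_hdr": 812, "accurate_hdr": 759}
c1 = {"bright_sdr": 409, "accurate_sdr": 333, "bright_hdr": 790, "accurate_hdr": 719}

gains = [(c2[k] - c1[k]) / c1[k] for k in c2]
print(f"average gain: {sum(gains) / len(gains):.1%}")  # roughly 6.5%
```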

The C2 is plenty bright enough for just about any viewing environment, but as usual it's not nearly as bright as competing LCD-based models. As with most TVs, the brightest mode for HDR and SDR (Vivid on the C2) is horribly inaccurate. For the accurate results listed above on the C2, I used ISF Expert Bright picture mode (Peak Brightness: High) for SDR and Filmmaker mode for HDR. I recommend C2 owners do the same to get good color in bright rooms. Note that with SDR, you'll need to disable the Auto Energy Saving setting (Support > Energy Saving > Energy Saving Step > Off) to get full brightness.

The screen of the C2 was excellent from off-angle but didn't seem to reduce reflections quite as well as the C1.

David Katzmaier/CNET

Like all OLED TVs, the C2 gets quite a bit dimmer than LCDs when showing full-screen white -- a snow field, for example -- but even in those situations it's hardly dim. The C2's screen finish was excellent at preserving black levels, better than the TCLs' more matte finishes, which beat both LG's at rejecting reflections. The screen of the C1 seemed slightly more reflective than the C2, but the difference was minimal.

Uniformity and viewing angle: Like all OLEDs I've tested the C2 was exemplary in this area compared to LCD-based TVs, with no significant brightness or color variations across the screen and nearly perfect image quality from off-angle. Comparing the C2 and C1 I saw a very slight color shift toward blue and magenta on the C2 that wasn't visible on the C1, something that could be caused by the new Evo panel structure. It was only visible from very extreme angles, however, and has no real impact.

The C2 has myriad picture settings, but if you just want to set it and forget it, use Filmmaker Mode.

David Katzmaier/CNET

Picture setting notes

The most accurate settings were Cinema and Filmmaker mode for both HDR and SDR, as well as the two ISF modes available in SDR. For SDR viewing I went with Cinema for dark rooms (because it was closer to my 2.2 gamma target) and ISF Bright for brighter environments, and for HDR I used Filmmaker (which was very slightly brighter than Cinema HDR). Game Optimizer is best for gaming, thanks to its processing, but quite blue; for the best color accuracy for gaming you should adjust the color temperature control all the way toward red (Picture > Advanced Settings > Color > White Balance > Color temperature > Warm50).
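For reference, the 2.2 gamma target mentioned above describes how the input signal maps to relative luminance, roughly luminance = signal ** gamma. A small sketch:

```python
# What a 2.2 gamma target means: relative luminance follows signal ** gamma,
# so a 50% input signal lands near 0.5 ** 2.2, or about 21.8% of peak output.
def relative_luminance(signal: float, gamma: float = 2.2) -> float:
    return signal ** gamma

print(relative_luminance(0.5))        # ~0.218 at the 2.2 target
print(relative_luminance(0.5, 2.16))  # ~0.224 at the C2's measured 2.16 average
```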

Like most TVs, the C2 offers settings that engage smoothing, aka the soap opera effect; I prefer to turn it off for TV shows and movies (and it's off in Game Optimizer mode because it increases input lag). You can experiment with the settings (Picture > Advanced Settings > Clarity > TruMotion), and it's off by default in the Cinema and Filmmaker modes.

Geek box

| Test (SDR) | Result | Score |
| Black luminance (0%) | 0.000 | Good |
| Peak white luminance (10% win, nits) | 389 | Average |
| Avg. gamma (10-100%) | 2.16 | Good |
| Avg. grayscale error (10-100%) | 1.34 | Good |
| Dark gray error (30%) | 0.67 | Good |
| Bright gray error (80%) | 1.66 | Good |
| Avg. color checker error | 0.95 | Good |
| Avg. saturation sweeps error | 1.00 | Good |
| Avg. color error | 0.81 | Good |
| Input lag (Game mode, ms) | 13.47 | Good |



| Test (HDR10) | Result | Score |
| Black luminance (0%) | 0.000 | Good |
| Peak white luminance (10% win, nits) | 759 | Average |
| Gamut % UHDA/P3 (CIE 1976) | 99.62 | Good |
| ColorMatch HDR error | 5.93 | Poor |
| Avg. color checker error | 2.94 | Good |
| Input lag (Game mode, 4K HDR, ms) | 13.47 | Good |

See How We Test TVs for more details.

Portrait Displays Calman calibration software was used in this review. 

