Always punctual, if perpetually out of step with the rest of the industry's calendar, NVIDIA this afternoon wrapped up their 2020 fiscal year with the release of their earnings for both Q4 and the year as a whole. For the last quarter of their fiscal year, NVIDIA booked just over $3.1B in revenue with a profit of $950M, marking a strong end to a weaker fiscal year. On which note, for the year NVIDIA closed the books on $10.9B in revenue, for a net income a hair under $2.8B.

NVIDIA Q4 FY2020 Financial Results (GAAP, $ in millions except EPS)

|                  | Q4'FY2020 | Q3'FY2020 | Q4'FY2019 | Q/Q    | Y/Y     |
| Revenue          | $3,105    | $3,014    | $2,205    | +3%    | +41%    |
| Gross Margin     | 64.9%     | 63.6%     | 54.7%     | +1.3pp | +10.2pp |
| Operating Income | $990      | $927      | $294      | +7%    | +237%   |
| Net Income       | $950      | $899      | $567      | +6%    | +68%    |
| EPS              | $1.53     | $1.45     | $0.92     | +6%    | +66%    |

Beating analyst expectations, NVIDIA closed their year on a relative high note. The $3.1B in revenue they booked was their best quarter in more than a year, blasting past a particularly weak Q4'FY19 with a 41% jump in revenue, and even edging out the traditionally strong Q3. Similarly, the quarter was one of the most profitable for the company in quite some time, beating Q4'FY19's net income by 68%, and leaving the company just a few percent short of claiming a full billion dollars in net income for the quarter.

This profitability is reflected in NVIDIA's gross margin as well. At 64.9% for the quarter, it's the highest margin NVIDIA has attained in over a year, beating both Q3 and last year's Q4. And while there's no such thing as a gross margin that's too high, it's worth noting that margins like these are close to some of Intel's best from previous years – a figure that's often used as a barometer for the overall strength of a major chip company.
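For readers who want to check the math, the Q/Q and Y/Y columns in the table above are simple percentage changes, and the gross margin implies a cost-of-revenue figure. A minimal Python sketch, using only the reported figures (the variable names are our own):

```python
# Sanity-checking the Q4 FY2020 GAAP table above (figures in $M).
revenue_q4_fy20 = 3105
revenue_q3_fy20 = 3014
revenue_q4_fy19 = 2205
gross_margin = 0.649  # 64.9% for Q4 FY2020

def pct_change(new: float, old: float) -> float:
    """Percentage change from old to new, as in the Q/Q and Y/Y columns."""
    return (new - old) / old * 100

print(f"Q/Q revenue: {pct_change(revenue_q4_fy20, revenue_q3_fy20):+.0f}%")  # +3%
print(f"Y/Y revenue: {pct_change(revenue_q4_fy20, revenue_q4_fy19):+.0f}%")  # +41%

# Gross margin = (revenue - cost of revenue) / revenue, so the quarter's
# implied cost of revenue is:
cost_of_revenue = revenue_q4_fy20 * (1 - gross_margin)
print(f"Implied cost of revenue: ~${cost_of_revenue:,.0f}M")  # ~$1,090M
```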

NVIDIA Quarterly Revenue Comparison (GAAP, $ in millions)

| Segment                    | Q4'FY2020 | Q3'FY2020 | Q4'FY2019 | Q/Q  | Y/Y  |
| Gaming                     | $1,491    | $1,659    | $954      | -10% | +56% |
| Professional Visualization | $331      | $324      | $293      | +2%  | +13% |
| Datacenter                 | $968      | $726      | $679      | +33% | +43% |
| Automotive                 | $163      | $162      | $163      | +1%  | 0%   |
| OEM & IP                   | $152      | $143      | $116      | +6%  | +31% |

Breaking down their revenue by segment, the big surprise in NVIDIA's earnings is data center revenue. At $968M for the quarter, it's the best showing from NVIDIA's data center operations since the inception of the current reporting structure, shooting well past the previous record. According to NVIDIA, the company is seeing a surge in demand for AI hardware, which has been a lucrative venture for the company over the last several years. This growth comes after data center spending (and AI-related spending in general) plateaued a bit over the past year; it seems hyperscalers and other data center operators have ramped up their overall buying for 2020.

Otherwise, gaming remained NVIDIA's single biggest segment. Like the quarter overall, gaming revenue is up significantly year-over-year, with NVIDIA booking over $500M more than in Q4'FY19. But it's a bit of a mixed bag overall: revenue dropped versus the previous quarter, and NVIDIA remains well off their Q4'FY18 performance. Ultimately, data center revenue proved to be NVIDIA's trump card here, helping to cover for any weakness in gaming revenue.

NVIDIA FY2020 Full Year Financial Results (GAAP, $ in millions except EPS)

|                  | FY2020  | FY2019  | Y/Y    |
| Revenue          | $10,918 | $11,716 | -7%    |
| Gross Margin     | 62.0%   | 61.2%   | +0.8pp |
| Operating Income | $2,846  | $3,804  | -25%   |
| Net Income       | $2,796  | $4,141  | -32%   |
| EPS              | $4.52   | $6.63   | -32%   |

As for the complete fiscal year 2020 picture, NVIDIA's Q4 has helped to prop up what has been a profitable but overall weaker year for the company. The $10.9B in revenue that NVIDIA booked for the year is down 7% from the previous year. And net income fell even more sharply, dropping by 32% to $2.796B on the year.

The year-over-year drop was influenced by several factors, but arguably the biggest is the crypto hangover, which only really ended earlier this year. As a result, the first half or so of NVIDIA's year was marked by distributors still trying to clear out excess inventory, and, compared to the unbounded spending on crypto gear in NVIDIA's FY2019, anything more normal pales in comparison. Coupled with that was the previously mentioned softness in the data center market, which, while not nearly as dramatic as the crypto hangover, saw much of FY2020's data center spending underperform FY2019 at similar points.

NVIDIA Yearly Revenue Comparison (GAAP, $ in millions)

| Segment                    | FY2020 | FY2019 | Y/Y  |
| Gaming                     | $5,518 | $6,246 | -12% |
| Professional Visualization | $1,212 | $1,130 | +7%  |
| Datacenter                 | $2,983 | $2,932 | +2%  |
| Automotive                 | $700   | $641   | +9%  |
| OEM & IP                   | $505   | $767   | -34% |

There had been some concern that the datacenter market had reached saturation – at least for the current generation of products – but following Q4, at least, it looks like that's not the case. Overall, NVIDIA closes out the year up 2% on data center revenue, with the strong Q4 pulling the full-year figure into positive territory. Gaming doesn't fare quite so well; more exposed to the crypto hangover, NVIDIA still ends the fiscal year down 12% in gaming revenue versus FY2019.

The big winner here on a pure percentage basis is actually automotive, which was up 9% year-over-year, followed by NVIDIA’s trusty professional visualization group, which was up 7%. The upshot here, at least, is that NVIDIA has long desired to further diversify its business so that it isn’t quite so reliant on gaming revenue, and that’s certainly where FY2020 has taken them.

Finally, looking ahead to the first quarter of FY2021, NVIDIA is projecting with a bit of caution. The company expects to book $3.0B in revenue, with a gross margin of 65.0%.

The wildcard factor here is the ongoing COVID-19 (coronavirus) outbreak, which, along with getting trade shows like Mobile World Congress canceled, could also hurt overall tech spending in China. Officially, NVIDIA has knocked $100M off of their Q1 projections, though this is ultimately a rough estimate, as no one is quite sure what to expect. According to the company, China accounts for around 30% of their gaming sales – and gaming is still NVIDIA's largest segment – so if the COVID-19 outbreak hurts Chinese spending, NVIDIA is likely to feel it in their gaming revenues.
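To put that adjustment in rough perspective – purely a back-of-the-envelope illustration using the figures above, not anything NVIDIA has modeled – a short sketch:

```python
# Back-of-the-envelope look at NVIDIA's China gaming exposure (figures in $M).
# Inputs from the article: ~30% of gaming sales come from China, and NVIDIA
# has knocked $100M off of a roughly $3.0B Q1 revenue projection.
q4_gaming_revenue = 1491   # Q4 FY2020 gaming revenue
china_share = 0.30         # NVIDIA's stated rough share

china_gaming = q4_gaming_revenue * china_share
print(f"China gaming revenue in a Q4-like quarter: ~${china_gaming:,.0f}M")  # ~$447M

q1_guidance = 3000
covid_haircut = 100
print(f"COVID-19 haircut vs. guidance: {covid_haircut / q1_guidance:.1%}")  # 3.3%
# The $100M adjustment works out to a bit over a fifth of a typical quarter's
# Chinese gaming revenue -- cautious, but far from a worst-case scenario.
```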

Source: NVIDIA Investor Relations

Comments

  • UltraWide - Friday, February 14, 2020

    Before people ask... It's fiscal year!!!
  • CiccioB - Friday, February 14, 2020

    What is interesting is that they have fantastic 65% gross margins with dies as big as pizzas.
    It means 12nm yields are very, very good, and that's probably the main reason Nvidia is not in a hurry to move to a more expensive PP.
    Apparently the only advantage a new PP would give them is creating something even faster than what they are producing now, but seeing as the competitor is a light year behind, they can sit and cash in with their better architecture on an old, cheaper production node (which, by the way, still consumes less energy than the competitor's latest 7nm efforts).
    They will most probably jump to the node after 7nm (7nm+ or 6nm?) with EUV, which is going to be better and, most of all, cheaper in production than 7nm. That way they can also spare the cost of developing a generation of chips which would not last that long.

    AMD needed 7nm: had they remained at 14nm with RDNA, they would have created dies bigger than Nvidia's with none of the advanced features Turing has, and they could not have raised the clocks high enough to try to close the gap.
    7nm somewhat saved this year's GPU sales, but when Nvidia switches to 7nm and beyond, AMD's products will just show how bad they are despite the advanced PP they are built on.
    On the same PP, RDNA's high transistor count, its power consumption, and its lack of features will come to light even to those who just look at FPS/$ to evaluate technology and so justify buying years-old tech simply because it's cheap.
  • nt300 - Saturday, February 15, 2020

    You have lots of personal and opinionated assumptions in your post. It would be nice to add an "In My Opinion" somewhere so people won't take it as factual.

    RDNA took Nvidia by surprise and forced them to release the Super series. AMD products are far from bad; not sure where you come up with the claim that they are. In fact, the entire industry knows how overpriced Nvidia's RTX lineup was at launch, which translated into poor sales.
    RDNA is not old tech; it's an architecture designed primarily for PC gaming which also retains the GCN instruction set. The high-end 5700 XT already beat Nvidia's RTX 2070, the card it was intended to target.
    RDNA 2 is said to be a new microarchitecture and a next-generation graphics competitor.
  • BenSkywalker - Saturday, February 15, 2020

    40% faster than a 2080Ti while using 20%-30% less power with a full node process advantage would be an okay part; not great, but decent.

    Where is AMD right now? With a full node advantage, the fact that they aren't humiliating nVidia is embarrassing. With a half node edge they should have a decisive advantage; with a full node, their third-tier parts should be competing with the 2080Ti.

    The high end is the RTX Titan, then the 2080Ti, then the 2080 Super, then the 2080, then the 2070 Super, and only then do we get to the tier with an AMD competitor *with a full node advantage*.

    Go back through the history of GPUs; given the process advantage, you'd be hard pressed to find a weaker design than RDNA.

    Right now, the fastest RDNA card still loses to the 1080Ti, a 16nm part released three years ago. RDNA 2 is supposed to be great, like RDNA was supposed to be great, like async compute was supposed to be game changing, like HBM was going to change everything.

    If someone had told me a year ago that AMD would have a full node advantage and do no better than competing against nVidia's fourth-tier offerings in traditional rendering, without using any die space for ray tracing hardware, I'd have said they were trolling; no way their engineers could be that bad.
  • CiccioB - Sunday, February 16, 2020

    There is not "in my opinion" because what I listed are FACTS.

    You are looking to the comparison in the wrong way.
    It's not debatable that a Vega chip is faster than a GP106. And faster than a TU116.
    It's not the absolute FPS a chip creates that makes it good or bad.
    It's the resource it took you to get to those performances.
    What has AMD used to create a chip as fast as the (cut) one mounted on the RTX2070?
    A better PP and as much as the same transistor budget. But, and this but is what discriminates fanboys from those that can understand technology, they lack ALL Turing advanced features (do you know them or you are limited in just knowing only AMD propaganda and so to understand only the old features AMD can show an exploit on the consoles based on its old technology?).
    Have you understood that RDNA has the same features that Pascal had 3 years before?
    You can't compare RDNA to Turing, You have to compare it to old Pascal. Which means AMD is 3 years late.
    Moreover the power consumption is equal if not worst for AMD. And this is by using, if you have still not understood, a completely new production node as advanced as the 7nm vs Nvidia revised 16nm which is now 4 years old.

    Do you really think that obtaining those results with all those resources at play is a good thing that makes AMD products "advanced" and on par with its competitor?
    RDNA is not better than first GCN which was annihilated by Kepler, and started the price raising trend. With chip smaller and less power hungry Nvidia started to drawn AMD that has since hidden their GPU revenues into a general "client market" voice in their financial reports.
    GCN1.1, 1.2, Fiji, Polaris, Vega were all so bad against the competition that the price of the market leading GPUs raised to today maximum (up to know).
    Can you tell us why is that? Because they annihilate the competition? Or because AMD, since 2012, have just been under-pricing their products to try to somewhat sell them putting elephants against mice to show that they can still create competitive GPUs (hiding how much that cost to them?)
    How do you explain AMD 45% gross margin vs Nvidia 65%? Putting in the argumentation the fact that Ryzen is selling like hot cakes?
    How do you judge Vega VII vs Turing? Was it annihilating it? 7nm vs 16nm. No match for Nvidia.
    That doesn't make me really think AMD will be any match to Nvidia 7nm+ solutions as well.
    Nest generation, boys, next generation... 4 generation since GCN was born were still not enough. But the next one.. uhuuhuu.. the next one... how marvelous it will be. Just wait and see.
  • Korguz - Sunday, February 16, 2020

    wow.. gone from opinion to rant...
  • CiccioB - Monday, February 17, 2020

    Rant based on facts, my little AMD fanboy who doesn't want to (or can't) understand anything without "proof".
  • Korguz - Monday, February 17, 2020

    sounds more like you are insinuating amd has completely failed with navi so far.. but reviews look to say otherwise, so who's the fanboy?? seems to me you're the anti-amd fanboy, i believe your other posts on here have shown that. for one reason or another, you hate amd. but hey.. feel free to keep paying nvidia for their overpriced video cards, and intel's long-in-the-tooth cpus.....
  • CiccioB - Monday, February 17, 2020

    What are you talking about? Reviews? Based on what? Absolute FPS or FPS/dollar? How old are you to say something so childish? 15? If you are older, please consider yourself inadequate to write on a technical forum.

    I'm speaking about technology, my dear fanboy, about the sustainability of the choices AMD has made since the introduction of GCN, which has been demonstrated to be a complete failure for their pockets (and for the market in general).

    Do they want to create an elephant, with the cost of an elephant and the power consumption of an elephant (and the same sh*t output), to win against a mouse, whatever the (hidden) costs, just to save face in the market?
    Yes, they can. They can come up with something that has decent performance if they make it large and power-hungry enough.
    But if you UNDERSTAND technology – and I see AMD fanboys do not, given the mantra they always put forward: the FPS are enough, the price is lower, don't care about anything else – AMD products since the introduction of GCN have been miles behind the competitor, and the gap has been widening since 2012. Every generation, AMD has to put more and more resources into its GPUs to keep Nvidia's pace.
    We are at the point, if you have not understood it and have just looked at game-performance "reviews" and stopped understanding the world right there, where AMD is using a new, advanced (and expensive) PP and the same transistor count as Turing (obviously with a smaller die due to higher density, but that is not an AMD feature, it's a feature of TSMC's PP and the costs associated with adopting it), and the result still performs WORSE than the old 1080Ti. A 3-year-old card, based on 16nm.
    Yes, with everything AMD has put on the table, they came up with a GPU that is decent in power consumption and plain rasterization performance (only with GCN/console-optimized games, you know). As I said, just three years later.
    In absolute terms, AMD at last has a decent GPU on the market (3 years later), but given that on the same PP that GPU would be as big as Turing while having only the features of Pascal... well, no, it is not a good product that can survive Nvidia's next shrink, nor anything good unless priced low (and notice: no isolated GPU data in AMD's financial reports... guess why? Yes, because they annihilate the competition! And they can show it! Oh, well, no).

    And, no, I do not hate anyone. They are global companies that deserve neither love nor hate. The fact that you love them does not mean that someone making a criticism hates your beloved company.
    They have to work as best they can to make the industry move on.
    And, my dear AMD fanboy, AMD simply is not providing any good to the market: prices have been rising since 2012, and their cards are always slower and more power hungry than the competitor's. NOTHING that helps the market improve. And the reality, which you can't understand, actually says so.
    I just recognize these simple things, and I am not blinded by red glasses that somehow make GCN in all its forms a decent architecture. Well, it is not, and we see it today, and we'll see it even better next generation (the one that will provide an Nvidia-killer GPU... how many times have you fanboys said that? But sure, RDNA2 with HW-accelerated raytracing will kill Ampere.)
  • Korguz - Monday, February 17, 2020

    gee.. more ranting and even more insults.. and you are calling me the child?? and me the fanboy?? come on.. look in the mirror.. posting such long messages is a rant.. and the fact you resort to insults.. grow up.. as i said, your previous posts on here show you love nvidia and intel.. so who's the fanboy? and these also look and are written more like opinions; while maybe based on some fact, you don't state where you got these supposed facts.. but whatever.. keep insulting people.. i hope it makes you feel better about yourself...
