Daily Brief
Nvidia Corporation

Technology Company

Appears in 11 stories

Stories

AI models learn to read, predict, and write the genetic code of life

New Capabilities

The dominant maker of graphics processing units (GPUs) used in AI training, NVIDIA provided the computing infrastructure and engineering collaboration for Evo 2. - Infrastructure partner and co-developer of Evo 2

It took thirteen years and $2.7 billion to read the first human genome. Now a single AI model, trained on 9 trillion DNA base pairs from more than 128,000 species, can predict whether an uncharacterized mutation in a breast cancer gene is dangerous—with 90 percent accuracy—without ever being shown that gene. On March 4, the Arc Institute and NVIDIA published Evo 2 in Nature, the largest biological foundation model ever built: 40 billion parameters, a context window of one million nucleotides, and the ability to design synthetic genomes the size of a simple bacterium.

Updated 5 days ago

The race to replace copper inside AI data centers

New Capabilities

The dominant supplier of AI accelerator chips, NVIDIA is both a customer for optical interconnects and a developer of its own co-packaged optics networking platforms. - Strategic investor in Ayar Labs; developing own co-packaged optics platforms

Every time engineers double the data rate on a copper wire, electrical noise doubles too, cutting the usable cable length in half. That physics problem is now strangling the AI industry. As graphics processing units (GPUs) push toward 224 gigabits per second per lane, passive copper cables inside data centers can reach less than one meter before the signal degrades. Ayar Labs, a startup born from research at the Massachusetts Institute of Technology (MIT) and the University of California, Berkeley, just closed $500 million in Series E funding at a $3.75 billion valuation to mass-produce chips that replace those copper links with light.
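The copper-reach tradeoff described above can be sketched numerically. A minimal model (an illustrative assumption, not Ayar Labs' or any vendor's engineering data) treats usable passive-copper reach as halving each time the lane rate doubles from an assumed 56 Gb/s baseline:

```python
# Illustrative signal-integrity model: passive-copper reach halves each time
# the lane rate doubles. Baseline figures are assumptions, not vendor specs.
BASE_RATE_GBPS = 56    # assumed baseline lane rate
BASE_REACH_M = 3.0     # assumed usable reach at that baseline rate

def copper_reach_m(rate_gbps: float) -> float:
    """Usable passive-copper reach, assuming reach scales as 1 / rate."""
    return BASE_REACH_M * BASE_RATE_GBPS / rate_gbps

for rate in (56, 112, 224):
    print(f"{rate:>3} Gb/s/lane -> ~{copper_reach_m(rate):.2f} m")
```

Under these assumed numbers, the model lands at 0.75 m at 224 Gb/s per lane, in line with the "less than one meter" reach the story cites.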

Updated 7 days ago

Big tech's half-trillion-dollar AI bet

Money Moves

The primary beneficiary of hyperscaler AI spending, controlling 92% of the discrete GPU market for data centers. - Secures photonics supply chain with $4B investments amid hyperscaler capex surge

The four largest cloud providers—Microsoft, Meta, Alphabet, and Amazon—guided to over $650 billion in combined AI infrastructure spending for 2026 in their latest earnings reports, up sharply from $350 billion in 2025, and have begun tapping debt markets to fund the buildout. Microsoft and Meta reported on January 28-29 with divergent market reactions: Microsoft shares plunged 12% on $37.5 billion in quarterly capex, while Meta surged on 2026 guidance of $115-135 billion. Alphabet stunned investors February 4 with $175-185 billion capex plans—doubling last year's spend—while Amazon topped all on February 5 with a $200 billion pledge, 50% above 2025 and $50 billion over expectations, prompting a share selloff despite strong revenue beats.
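The $650 billion combined figure is roughly consistent with the individual guidance quoted above. A quick sanity check, using range midpoints and annualizing Microsoft's single reported quarter (both simplifying assumptions, not company guidance):

```python
# Sanity check on the combined 2026 capex figure, using midpoints of the
# guidance ranges quoted in the story. Annualizing Microsoft's $37.5B
# quarter to a full year is an illustrative assumption.
guidance_2026_billions = {
    "Alphabet": (175 + 185) / 2,   # $175-185B guidance, midpoint
    "Amazon": 200,                 # $200B pledge
    "Meta": (115 + 135) / 2,       # $115-135B guidance, midpoint
    "Microsoft": 37.5 * 4,         # one reported quarter, annualized
}
total = sum(guidance_2026_billions.values())
print(f"Combined 2026 capex estimate: ${total:.0f}B")  # ~$655B
```

The midpoint sum lands just above $650 billion, which squares with the "over $650 billion" combined guidance.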

Updated Mar 2

Autonomous machines move from mines to construction sites

New Capabilities

A leading designer of graphics processing units and AI computing platforms, now supplying the Jetson Thor hardware that enables real-time AI processing directly on Caterpillar's machines. - Technology partner powering Caterpillar's edge AI and autonomy systems

For more than three decades, giant autonomous trucks have hauled billions of tonnes of rock out of mines with no one behind the wheel. Now Caterpillar, the world's largest construction equipment manufacturer, is moving that technology to ordinary job sites. At CONEXPO-CON/AGG 2026 in Las Vegas, the company ran a 26,000-pound CS12 soil compactor through live compaction passes with an empty cab—the first time a major equipment maker has demonstrated this level of autonomous operation on a construction machine in a public setting.

Updated Mar 2

OpenAI assembles record private funding round

Money Moves

The dominant supplier of graphics processing units used to train and run artificial intelligence models, Nvidia's chips power the majority of the world's AI infrastructure including OpenAI's data centers. - Finalized $30B equity investment and committed to providing 5GW of Vera Rubin capacity to OpenAI

In October 2024, OpenAI raised $6.6 billion at a $157 billion valuation. Seventeen months later, on February 27, 2026, the maker of ChatGPT closed a record $110 billion funding round at a $730 billion pre-money valuation ($840 billion post-money)—the largest private capital raise in history. Amazon led with a $50 billion commitment ($15 billion upfront, $35 billion contingent on OpenAI achieving AGI or completing an IPO by year-end), while Nvidia and SoftBank each committed $30 billion. The round remains open for additional investors. The deal includes expanded infrastructure partnerships: Amazon will provide $100 billion in additional AWS compute services over eight years (on top of the existing $38 billion commitment), while Nvidia will supply 3 gigawatts of dedicated inference capacity and 2 gigawatts of training capacity using its Vera Rubin systems.
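The round's arithmetic checks out: pre-money valuation plus new capital equals the post-money figure, and the named commitments sum to the round size. A quick verification of the standard venture math:

```python
# Standard venture math: pre-money valuation + capital raised = post-money.
pre_money_b = 730      # $730B pre-money valuation
raised_b = 110         # $110B round
post_money_b = pre_money_b + raised_b
print(post_money_b)    # 840, matching the $840B post-money figure

# Named commitments from the story, in $B. Amazon's lead commitment splits
# into an upfront portion and a contingent portion.
amazon = 15 + 35       # $15B upfront + $35B contingent
nvidia = 30
softbank = 30
print(amazon + nvidia + softbank)  # 110, matching the round size
```

The three named investors fully account for the $110 billion closed so far, which is consistent with the round remaining open only for additional capital beyond it.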

Updated Feb 27

The race to build AI's physical foundation

Built World

Semiconductor company that became the dominant supplier of AI training and inference chips. - Dominant AI GPU supplier facing Meta's AMD diversification; $51.2B datacenter revenue in Q3 2025

ChatGPT's November 2022 launch triggered the fastest infrastructure buildout in tech history: datacenter construction spending tripled from $15 billion to $45 billion annually in just two years. Hyperscalers are now on track to spend over $1 trillion in 2026, more than the GDP of all but 10 countries, as they race to secure power, land, and cooling systems before their rivals. Alphabet shocked markets on February 4, 2026 with guidance of $175-185 billion in 2026 capex, 55-65% above the Wall Street estimate of $119.5 billion. Amazon escalated the spending war on February 5 with $200 billion in 2026 capex guidance, after Q4 revenue of $213.4 billion and 24% AWS growth to $35.6 billion. Microsoft reported $37.5 billion in capex for Q2 FY2026 alone, a single quarter. Meta, meanwhile, committed $6 billion to Corning for fiber-optic cables in late January, secured 6.6 gigawatts of nuclear power through three partnerships announced in early January 2026, confirmed a multi-billion-dollar Nvidia chip deal, and on February 24 announced a $60-100 billion, 6-gigawatt AMD GPU deal that diversifies it away from Nvidia.

Updated Feb 24

AI chip testing becomes a strategic bottleneck

New Capabilities

Dominant supplier of AI accelerators whose production volumes directly drive demand for Advantest's test equipment. - Primary driver of AI chip testing demand

Advantest, a Japanese company most people have never heard of, just posted record quarterly sales—and its stock now moves in near-lockstep with NVIDIA's. The reason: every advanced AI chip must pass through test equipment before it ships, and Advantest controls nearly 60% of the global market for the machines that do this. As AI spending explodes, chip testing has quietly become one of the supply chain's tightest chokepoints. Yet the company faces intensifying competition: U.S. rival Teradyne is gaining ground in memory testing, and the entire semiconductor equipment sector is experiencing unprecedented demand as chipmakers race to expand capacity for AI accelerators and high-bandwidth memory.

Updated Feb 4

China's $1.2 trillion pivot

Money Moves

Leading AI chip designer navigating U.S.-China tech restrictions. - Subject to new semiconductor export rules

China posted a $1.2 trillion trade surplus for 2025—the largest any country has ever recorded. The number is roughly equivalent to the GDP of Indonesia, the world's 16th-largest economy. It comes after seven years of U.S. tariffs designed to shrink that very surplus, and eight days after Canada struck a deal with Beijing that slashed Chinese EV tariffs from 100% to 6.1%. That deal marked a dramatic shift in Western trade policy toward China and prompted Trump to threaten 100% retaliatory tariffs on Canadian goods.

Updated Jan 30

The packaging pivot: why AI's real bottleneck isn't chips—it's putting them together

Built World

The world's largest AI chip company and dominant buyer of advanced packaging and HBM capacity. - Primary customer driving HBM demand; claimed exclusive HBM4 access through 2026

For decades, chip packaging was the unglamorous final step—stacking and connecting silicon dies after the real engineering was done. Now it's the constraint holding back AI. SK Hynix announced a $12.9 billion investment to build the world's largest advanced packaging facility in South Korea, a bet that the company controlling 61% of the high-bandwidth memory market can't afford to lose its lead as competitors circle. At CES 2026, the company unveiled the first 16-layer, 48GB HBM4 module—double the capacity of current generation memory—requiring silicon wafers thinned to just 30 micrometers, thinner than a human hair.

Updated Jan 15

Nvidia's $20 billion Groq deal: the AI inference land grab

New Capabilities

Controls 90%+ of the AI chip market through GPU dominance and the CUDA software moat. - Acquiring Groq's assets and team for $20B

On Christmas Eve 2025, Nvidia paid $20 billion for Groq's assets—nearly triple the AI chip startup's $6.9 billion valuation from three months earlier. The deal brings Groq's founder Jonathan Ross, who created Google's original Tensor Processing Unit, and his breakthrough inference technology into Nvidia's fold. It's Nvidia's largest acquisition ever, nearly three times bigger than its $7 billion Mellanox purchase. By structuring the deal as a "non-exclusive licensing agreement" rather than an outright acquisition, Nvidia bypasses Hart-Scott-Rodino Act merger review requirements that trigger automatic FTC scrutiny—following Microsoft's 2024 playbook with Inflection AI. The deal's unusual structure has drawn immediate analyst warnings about "the fiction of competition," as Groq's leadership and technical talent move to Nvidia while the company nominally continues to operate independently. Adding to the intrigue: 1789 Capital, where Donald Trump Jr. serves as partner, was among Groq's September investors, and saw its stake nearly triple in just three months.

Updated Dec 27, 2025

Trump reopens China to Nvidia’s H200—now Congress wants the national-security math

Rule Changes

Nvidia is the AI era’s arms dealer—and the political lightning rod for who gets compute. - Seller of the H200 chip; lobbying for access to China while navigating export-control swings

The Trump administration just did the thing Washington has spent years swearing it wouldn’t do: let China buy a near-top-tier Nvidia AI chip again. Now a key China hawk in Congress is demanding the Commerce Department explain, in detail, why this isn’t a strategic own-goal.

Updated Dec 13, 2025