AI News Digest, May 7: AI Chip IPO Tests the Enterprise AI Thesis

Wall Street wants to know if AI hardware deserves a Nasdaq listing. Enterprise buyers want to know if AI workers deserve their roles. This week the two questions collided. An AI chip IPO is testing the enterprise AI thesis. A Greenhouse acquisition is testing whether recruiters can outrun AI applicants. Coinbase and Freshworks are testing how candid companies can be about AI replacing roles. And Krutrim is testing whether national AI champions still need frontier models. Here is what each move tells you about the year ahead.

AI Chip IPO Hits Wall Street as Cerebras Files for $3.5 Billion Raise

On May 5, Cerebras Systems filed updated S-1 paperwork to list on Nasdaq under “CBRS” (CNBC). The plan: 28 million Class A shares at $115 to $125 each. At the top of the range, the AI chip IPO would raise about $3.5 billion, for an implied market value near $26.6 billion. The book is reportedly oversubscribed at roughly $10 billion of orders, so the final price could land higher (Bloomberg).
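The deal math is worth sanity-checking yourself. A minimal sketch, using the share count and price range as reported; the implied-share-count figure at the end is our own back-of-envelope inference, not a number from the filing:

```python
# Deal terms as reported: 28M Class A shares at $115-$125 each.
shares_offered = 28_000_000
price_low, price_high = 115, 125

# Gross proceeds at the top of the range.
max_raise = shares_offered * price_high
print(f"Max raise: ${max_raise / 1e9:.2f}B")  # → Max raise: $3.50B

# Backing out total shares outstanding from the reported ~$26.6B
# implied market value at the top-of-range price (our inference).
implied_valuation = 26.6e9
total_shares = implied_valuation / price_high
print(f"Implied total shares: ~{total_shares / 1e6:.0f}M")  # → ~213M
```

The gap between 28 million shares offered and roughly 213 million implied shares outstanding is why a $3.5 billion raise can coexist with a $26.6 billion valuation: only a small float is being sold.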

The S-1 anchors the story on a single customer: OpenAI. The prospectus describes a multi-year compute deal worth more than $20 billion and a 750 megawatt OpenAI commitment running through 2028 (TechCrunch). For context, Cerebras posted $510 million in 2025 revenue, up 76 percent year on year, against an operating loss of $146 million.

Why this AI chip IPO matters for AI buyers

Three things change once an AI chip IPO of this scale prices.

First, the public market gets to vote on the AI infrastructure thesis. Until now, AI hardware valuations were set by private rounds at whatever figure the lead investor would pay. A successful listing sets a public reference price for compute capacity. Second, it certifies a credible Nvidia alternative. Cerebras’ wafer-scale design is genuinely different, and if buyers can hedge compute exposure across two or three vendors, inference costs drop faster than consensus assumes. Third, OpenAI gets equity exposure to its own supplier; the S-1 mentions the stake explicitly (The Information). That is the new vendor pattern: equity tied to multi-year purchase commitments.

For HR and finance leaders, the practical implication is concrete. Are you paying enterprise AI vendors today? Ask what their compute cost looks like in 18 months under a two-supplier scenario. Then ask whether the per-seat pricing they sold you reflects that future cost, or just last quarter’s. Because the AI chip IPO wave signals broader compute supply, the AI vendor pricing conversation is about to get more useful. For a baseline read on the buyer-side moat questions, see our piece on top AI tools for HR.

Greenhouse buys an AI voice interviewer to fight AI applicants

Greenhouse signed a definitive agreement on May 5 to acquire Ezra AI Labs (Greenhouse Newsroom). Ezra, founded in 2024, builds a voice AI interviewer that runs structured candidate conversations and integrates with existing applicant tracking systems. The deal closes this quarter.

Greenhouse’s own data points to why. First, applications per recruiter on the platform have spiked 412 percent since 2023. Second, 74 percent of candidates now use AI in their job search. Meanwhile, 46 percent of candidates say their trust in hiring has dropped this past year (PR Newswire).

So the IPO conversation is happening upstream while the AI applicant flood is happening downstream. Both pressure the same buyer. Run recruiting at a 50 to 500 person company, and the question becomes simple. Do AI voice interviews stay neutral signal? Or do they become another arms race? Either way, expect AI applicant tracking systems to start bundling structured AI interviews as a default screen, not an add-on. For the broader recruiting picture, our take on AI in HR recruitment covers the workflow side.

Coinbase and Freshworks cut headcount as AI lands inside the org chart

On the same day as the Cerebras filing, Coinbase confirmed it would cut about 700 roles. That is roughly 14 percent of its workforce, as it shifts to an “AI-native” operating model (CNBC). At the same time, Freshworks said it would cut 11 percent of its global workforce, about 500 jobs (TechRadar).

Coinbase CEO Brian Armstrong was specific: AI lets engineers ship in days what teams used to ship in weeks, non-technical staff are now writing code, and many internal workflows are being automated. Coinbase will book $50 to $60 million in restructuring charges in Q2, mostly cash severance (Yahoo Finance).

Tech layoffs in 2026 now approach 100,000, and AI is cited as a contributing factor in roughly one in eight cuts. The honest read: some of these announcements are AI-washing, but some are not. For founders, the practical question is what your operating model looks like once agentic tools compress two roles into one. For HR leaders, the harder question is whether your AI skills gap plan covers the people you will keep, not just the ones you will hire.

Krutrim swaps frontier AI for cloud profit, books first net income

Bhavish Aggarwal’s Krutrim said May 5 that it is pivoting strategy (TechCrunch). India’s first GenAI unicorn is moving from frontier model training to a full-stack AI cloud platform for enterprises, pausing its chip and frontier-model work. Revenue tripled to about ₹300 crore in FY26, and the firm posted its first annual net profit, with margins above 10 percent (Medianama).

The pivot reads like an honest emerging-markets AI playbook. Krutrim now serves more than 25 enterprise customers, including telcos and BFSI names, with the AI cloud sold on data-residency and India-stack grounds. So while the IPO drama plays out on Nasdaq, India’s counterpart is selling the picks-and-shovels layer above it: GPU access, India-localized inference, and managed agent runtimes for regulated industries.

For founders selling to Indian customers, the implication is simple. Sovereign AI infrastructure now has a domestic, profitable vendor, and that shifts vendor selection. Not because the model layer is special, but because data residency is enforceable inside Indian jurisdiction. That matters more once the compressed DPDP timeline lands later this year.

Quick Hits

  • Production multi-agent debugging crosses a threshold. A May arXiv paper (2605.03505) reports an LLM multi-agent root-cause system at 91.3 percent accuracy on real production microservices. The single-agent ReAct baseline scored 39.8 percent. Anchor stat for any “agents are not ready” pushback.
  • EPAM and Anthropic ink a Claude services pact. EPAM Systems and Anthropic announced a multi-year deal on May 6 (PR Newswire). EPAM is committing to 10,000 Claude-certified architects. Services-led distribution is now a real Anthropic motion, not just an Accenture story.
  • South Korea funds a 15,000-GPU national compute center. South Korea’s National Growth Fund approved roughly 8.4 trillion won (about $5.7 billion) of AI investments through April (UPI). The APAC sovereign-AI race is now genuinely funded, not just announced.

If today’s AI chip IPO momentum looks familiar, that is because the AI infrastructure thesis is being priced in two places at once. Public markets fund the chips. Enterprise contracts pay for the seats above them. Asanify’s HCM software sits in that seats layer, with API-first hooks and a data-residency posture.

FAQ on the AI Chip IPO Wave

What is the Cerebras AI chip IPO and why is it a big deal?

Cerebras Systems is an AI hardware startup. Its wafer-scale processors are designed as alternatives to Nvidia GPUs for AI training and inference. Its updated S-1 filing on May 5 targets a Nasdaq listing at $115 to $125 a share, raising roughly $3.5 billion at a $26.6 billion valuation. The AI chip IPO matters because it is the first public-market reference price for AI compute capacity outside Nvidia. OpenAI is the multi-billion-dollar anchor customer with an equity stake.

How does the AI chip IPO affect HR and operations leaders?

It does not change anything about your tools today, but it changes the supply-side economics within 12 to 24 months. When buyers gain a credible Nvidia alternative, the per-token cost of inference trends down faster. That feeds into how AI vendors price seats and consumption tiers. Use the IPO as a prompt to renegotiate AI software contracts. Push for shorter terms, model-portability clauses, and explicit price-step protections.

Why are companies like Coinbase laying off staff during the AI chip IPO frenzy?

Coinbase, Freshworks, and others framed their May 2026 layoffs as shifts to AI-native operating models, not as cost cuts driven by weak demand. The AI chip IPO is a signal that compute capacity for AI workloads keeps growing, which lowers the cost barrier for companies to automate routine work. For HR leaders, the practical step is to close the AI skills gap for the staff you keep, not just the ones you hire.

Not to be considered as tax, legal, financial, or HR advice. Regulations change over time, so please consult a lawyer, accountant, or labour law expert for specific guidance.

Simplify HR Management & Payroll Globally

Hassle-free HR and Payroll solution for your Employees Globally

Your 1-stop solution for end-to-end HR Management