Three Mystery Whales Have Each Spent $10 Billion-Plus On Nvidia’s AI Chips
Nvidia has three anonymous customers, referred to as Customers A, B, and C, who have each spent over $10 billion on the company's AI chips so far this year.
AI microchip supplier Nvidia, the world’s most valuable company by market cap, remains heavily dependent on a few anonymous customers that collectively contribute tens of billions of dollars in revenue.
In its quarterly 10-Q filing with the SEC, the AI chip darling once again warned investors that it has key accounts so large that each one's orders crossed the threshold of 10% of Nvidia's total consolidated revenue.
An elite trio of particularly deep-pocketed customers, for example, each purchased between $10 billion and $11 billion worth of goods and services over the first nine months of Nvidia's fiscal year, which ended in late October.
Fortunately for Nvidia investors, that spending is unlikely to change any time soon. Mandeep Singh, global head of technology research at Bloomberg Intelligence, says he believes founder and CEO Jensen Huang's prediction that spending will not stop.
"The data center training market could hit $1 trillion without any real pullback," he says. By that point, Nvidia's share will almost certainly drop markedly from its current 90%, but even a much smaller slice of a $1 trillion market would still mean hundreds of billions of dollars in revenue annually.
More on Nvidia’s A, B, and C customers on Yahoo Finance
Sir Paul Nurse, Demis Hassabis, Jennifer Doudna, and John Jumper | AI for Science
Join Professor Hannah Fry at the AI for Science Forum for a fascinating conversation with Google DeepMind CEO Demis Hassabis.
They explore how AI is revolutionizing scientific discovery, delving into topics like the nuclear pore complex, plastic-eating enzymes, quantum computing, and the power of Turing machines.
The episode also features a special 'ask me anything' session with Nobel Laureates Sir Paul Nurse, Jennifer Doudna, and John Jumper, who answer audience questions about the future of AI in science.
Apple Is Reportedly Working On ‘LLM Siri’ To Compete With ChatGPT
A more conversational Siri could arrive in 2026, according to Bloomberg.
Apple is planning a major AI overhaul of Siri that will make the digital assistant more like OpenAI’s ChatGPT and Google’s Gemini Live, according to a report from Bloomberg’s Mark Gurman. The assistant, reportedly called “LLM Siri,” will be powered by Apple’s AI models and will let users have more conversational, natural-sounding interactions with Siri.
As part of Apple’s plan to infuse its iPhones with AI, Bloomberg says the company will make Siri better at handling more advanced tasks. The assistant will have an “expanded” ability to use App Intents to interact with third-party apps, and it will also use Apple Intelligence to summarize and write text.
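For readers unfamiliar with App Intents, it is the existing Apple framework that lets apps expose actions to Siri and Shortcuts. Here is a minimal, hypothetical sketch of what such an intent looks like today; the type name and dialog echo Federighi's garage-door example in the quote below and are purely illustrative, not from Apple's report or any real app:

```swift
// A minimal, hypothetical sketch of an App Intent a third-party app might
// expose so an assistant like Siri can trigger it. The names below
// (OpenGarageDoorIntent, the dialog text) are illustrative only.
import AppIntents

struct OpenGarageDoorIntent: AppIntent {
    // The title Siri and Shortcuts display for this action.
    static var title: LocalizedStringResource = "Open Garage Door"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call into its own garage-control logic here.
        return .result(dialog: "Opening the garage door.")
    }
}
```

The "expanded" ability Bloomberg describes would presumably let the new Siri chain and invoke intents like this one more flexibly across apps.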
Though Apple could announce these plans as soon as next year, the company intends to replace Siri’s underlying software with the new system in the spring of 2026, according to Bloomberg. During an interview with The Wall Street Journal last month, Apple senior vice president of software Craig Federighi hinted at how an AI-enhanced Siri will stand out from ChatGPT:
“The properties of something like OpenAI advanced voice mode and Siri are quite different. That OpenAI mode is great if you want to go ask a question about quantum mechanics and have it write a poem for you about it... It’s not going to open your garage. It’s not going to help you send a text message. There are many, many useful things Siri does for you every day, does them quickly and locally on your device. There’s a spectrum here, there’s a tradeoff across capabilities. Will these worlds converge? Of course, that’s where the direction is going.”
More about Apple’s LLM Siri on The Verge
How to Test a New AI Chip | IBM Research
So your team of semiconductor researchers has designed and built a brand-new type of chip that has the potential to be far more efficient for AI computations. But how do they test that it actually works as intended?
That’s where IBM Research Hardware Engineer JohnDavid Lancaster comes in. In this behind-the-scenes look at his lab in Yorktown Heights, he demonstrates how his team tests and validates the “power budget” — a “wattage-to-inference” ratio — of IBM’s latest AIU (Artificial Intelligence Unit) chips, which were specifically designed and optimized for AI applications.
“One of the key reasons you would use our AIU card is the super low power for how many inferences you’re able to compute.” Lancaster shows how mapping the voltage of the entire card can ultimately help inform sustainable (low-power) encoder and decoder model design.
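To make the "wattage-to-inference" idea concrete, here is a minimal sketch of how such a ratio could be worked out from a card's average power draw and inference throughput. The numbers are made up for illustration; this is not IBM's tooling or data:

```swift
// Illustrative sketch only: hypothetical numbers, not IBM measurements.
// Converts a card's power draw and throughput into a per-inference budget.
import Foundation

let averagePowerWatts = 40.0       // hypothetical average card power draw
let inferencesPerSecond = 2_000.0  // hypothetical inference throughput

// Energy consumed per inference, in joules (1 watt = 1 joule per second).
let joulesPerInference = averagePowerWatts / inferencesPerSecond

// How many inferences the card completes per joule of energy.
let inferencesPerJoule = inferencesPerSecond / averagePowerWatts

print(String(format: "%.3f J per inference", joulesPerInference))
print(String(format: "%.0f inferences per joule", inferencesPerJoule))
```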
OpenAI Updates GPT-4o, Reclaiming Its Crown For Best AI Model
If you use ChatGPT for writing, you'll be happy to hear this.
OpenAI has released several new models lately, but GPT-4o remains its most advanced flagship model. It packs advanced reasoning, multimodality, and conversational capabilities into a single model, and with the latest update, it has become even more capable.
On Wednesday, OpenAI announced via a post on X that it had updated GPT-4o to improve overall performance, including better handling of uploaded files and improved writing capabilities that allow it to generate more natural and engaging text.
So, just how good is the upgrade? In the week before OpenAI announced it, the update, known as ChatGPT-4o (20241120), was tested on the Chatbot Arena LLM Leaderboard, a crowdsourced platform used to evaluate large language models (LLMs).
Users, who chat with two LLMs side by side and compare their responses without knowing the models' names, put GPT-4o in first place overall, above Gemini-Exp-1114. The updated GPT-4o also now ranks first in several other categories, including creative writing, coding, and hard prompts. To put the models to the test yourself, you can visit the Chatbot Arena and vote for free.
More about OpenAI’s GPT-4o upgrade
GitHub: Can Europe Win the Age of AI?
GitHub CEO Thomas Dohmke discusses Europe's readiness to lead the next era of AI innovation, examining how the continent's tech ecosystems stack up against those in the US.
In conversation with TEDAI Vienna co-curator Vlad Gozman, Dohmke explains the three key shifts that will help Europe thrive in the age of AI — and shows how GitHub's initiatives can empower anyone to build new ideas around the world.
Google Cloud Launches AI Agent Space Amid Rising Competition
The cloud computing wars have swiftly morphed into the AI wars, with leading cloud computing divisions Google Cloud, Microsoft Azure, and Amazon Web Services (AWS) all rolling out new tools for customers to access, use, deploy, and build atop a range of AI models.
Therefore, it was not too surprising to learn this week that Google Cloud is offering a new AI agent ecosystem program called AI Agent Space. This initiative empowers businesses to discover, deploy, and co-create AI agents designed to automate tasks, enhance customer experiences, and optimize operations.
With a growing focus on the enterprise, Google’s announcement positions it as a major player alongside competitors like Microsoft, SAP, and Salesforce.
Google will be promoting the hell out of any agents you build and make available in its AI Agent Space
Google’s ecosystem is built around enabling partners to develop highly customizable AI agents by providing them with robust tools and resources, including early previews of Google’s AI technologies, direct support from engineering teams, and best practices to streamline development.
In addition, Google says it will promote new agents through its Google Cloud Marketplace, allowing partners to scale the agents they build to new, interested audiences.
More about Google Cloud’s new AI Agent Space
A Conversation with Dario Amodei, Co-Founder and CEO of Anthropic
In this fireside chat at BoxWorks 2024, Box Co-founder and CEO Aaron Levie joins Anthropic Co-founder and CEO Dario Amodei to discuss how AI will accelerate and transform work for every business on the planet.
That's all for today, but new advancements, investments, and partnerships are happening as you read this. AI is moving fast, so subscribe today to stay informed.