This is a free article made possible by Big Technology subscribers, with no advertising included. To support my independent tech journalism, please consider subscribing for $8 per month.

Universities Are Woefully Under-Resourced For AI Research. They're Fighting To Change That.

Stanford University has 300 GPUs. Microsoft will have 1.8 million. Here's what's needed to make academia relevant in AI research.
In front of a packed room inside the Dirksen Senate Office Building in Washington, D.C. on Tuesday, the leadership of Stanford's Institute for Human-Centered AI made a plea. Co-director Fei-Fei Li and executive director Russell Wald told the approximately 200 congressional staffers gathered that universities today simply don't have the resources to do basic generative AI research. The chips, data centers, and energy costs are beyond the scope of university budgets. And they need serious help.

"All U.S. universities combined could not build a version of ChatGPT right now," Wald told me in an interview this week. "That's pretty problematic."

The dire situation is reaching a boiling point. Microsoft is aiming to have 1.8 million GPUs by the end of next month, while Stanford University has approximately 300 of these chips, which are the critical components for training and testing generative AI.

Wald then recited some stunning statistics showing how far universities have fallen behind the private sector in an area where they previously led. In 2022, Wald said, there were 22 significant AI breakthroughs that came from industry, compared to only three from academia. In 2011, AI PhDs went into private industry and academia in about equal numbers; now 70% go into private industry, he said. Professors and grad students interested in doing cutting-edge research also struggle to choose universities because the resources available will not allow them to do the work.

"It's deeply concerning about the future of the technology," Wald said, "because it begs the question — does the academy belong in frontier AI research? We at HAI would argue yes it does."

University technology research, Wald said, is necessary because it's freed from the intense product-focused work done within companies, which helps it spark fundamental breakthroughs like GPS, MRIs, and the internet. "These are things that no investor would ever put any money into, because they wouldn't see a return on investment," he said. "They'd be dead by the time they saw any really true profitable margin out of this."

Wald also noted that university AI research can help train regulators to understand AI, so that knowledge extends beyond those within the companies themselves. And it can serve the public good by publishing research and open-sourcing models, given that companies like OpenAI and Google have stopped sharing their cutting-edge research as the field has grown ultra-competitive.

Solutions

So, universities have a serious AI resource problem right now. Along with not having the money to invest in AI data centers (an NVIDIA GPU can cost up to $40,000 per chip), they face challenges in setting up facilities with enough space and cooling capacity to operate. And the energy costs add up as well. But seeing the problem, some academic institutions, including Stanford and a new group in New York, are fighting to fix it.

In New York, a first-of-its-kind coalition of universities, the state, and private philanthropy has just built an artificial intelligence computing center that started running experiments last week. The initiative, called Empire AI, is set up as a shared resource at the University at Buffalo and includes NYU, Cornell, Columbia, RPI, SUNY, and CUNY. The group received a donation of GPUs from the Simons Foundation, and support from Empire State Development.
"We're just in the process of testing Empire AI, we're starting to run the first jobs," said Stacie Grossman Bloom, NYU's chief research officer, vice provost, and vice chancellor for global research and innovation. NYU researchers are already using the machine to detect deepfakes. The university, Grossman Bloom said, has 19 work order requests in, with more to come. "The idea is for the machine to not be idle at all. But to really run it, run it hard, and run it at capacity," she said.

Empire AI currently has 96 NVIDIA H100 GPUs, which, even at a multi-million dollar value, don't come close to what companies like Amazon, xAI, Meta, Microsoft, and OpenAI are using to build, interrogate, and run their models. But the coalition does plan to meaningfully expand the compute resources within Empire AI as it moves beyond its 'alpha' phase.

Such initiatives ultimately won't put universities in a position to do foundational research without a massive increase in resources — and that's where the federal government might come in. Stanford's Wald and Li were in DC advocating for Congress to pass the "Create AI" Act, a bill that would make permanent a shared resource enabling large-scale AI research. This "National Artificial Intelligence Research Resource," which has a pilot established by the Biden administration through executive order, would make compute and data sets available to universities and students to run AI experiments. The U.S. government and partner companies like NVIDIA and Microsoft have already put tens of millions of dollars into the project, but the bill would establish it in law and help it get funded through federal appropriations, via the National Science Foundation. The funding figure currently under discussion in Congress is about $400 million to $500 million per year over six years, Wald said. It would be a paradigm-changing bill if passed.

Wald said that he and Li set up this week's gathering with Congressional staffers last Friday. Initially, only ten signups came in, and they began to worry. But the event became so popular that a line wrapped around the hall as people scrambled to get in.

The Create AI bill has momentum within Congress today. There's bipartisan support in both houses. Seventy-eight companies, universities, and institutions, from Princeton to Google, wrote to Congressional leadership this week supporting it. And it could pass during the lame-duck session. But Wald said the bill's fate hinges on legislators agreeing to get it done in a very busy period, effectively giving it a six-week window before Congress turns over. "It is probably one of the top easy things to pass on any AI agenda," Wald said. And now, the only question is whether the will is there to do it.

Advertise on Big Technology? Reach 150,000+ plugged-in tech readers with your company's latest campaign, product, or thought leadership. To learn more, write [email protected] or reply to this email.

What Else I'm Reading, Etc.

Copilot is flailing as costs and obstacles add up [Business Insider]
Google spinning off Chrome is impractical and unlikely [Spyglass]
Clear is making it normal to submit your biometrics, but at what cost? [MIT Tech Review]
The Jaguar rebrand is a disaster [WSJ]
Big Technology Podcast Q&A with Rivian CEO RJ Scaringe [Big Technology]
I joined CNBC to talk about OpenAI's plan to build a browser [YouTube]

Number of The Week

94%

NVIDIA nearly doubled its revenue in Q3 2024, up 94% compared to the same quarter in 2023. Though its pace of growth is set to slow considerably over time.
Quote of The Week

"Foundation model pre-training scaling is intact and it's continuing."

NVIDIA CEO Jensen Huang told analysts during this week's earnings update that AI progress hasn't stalled.

This Week on Big Technology Podcast: Rivian's CEO on Elon's Influence, The Plan For Profitability, And The Grid's Risky Future

RJ Scaringe is the CEO of Rivian. He joins Big Technology Podcast for a deep discussion about the state of electric vehicles and where Rivian goes from here. We go into depth about Elon Musk's role in the coming Trump administration and what Scaringe would do if he were the "first buddy." Then we talk about Rivian's affordability, its partnership with VW, its road to profitability, and more. Stay tuned for the second half, where we discuss energy and the grid — and Scaringe shares a passionate stance on plug-in hybrids.

Thanks again for reading. Please share Big Technology if you like it! And hit that Like button to support a shared resource we can all get behind: massive like counts on this newsletter.

My book Always Day One digs into the tech giants' inner workings, focusing on automation and culture. I'd be thrilled if you'd give it a read. You can find it here.

Questions? News tips? Email me by responding to this email, or by writing [email protected]. Or find me on Signal at 516-695-8680.

Thank you for reading Big Technology! Paid subscribers get our weekly column, breaking news insights from a panel of experts, monthly stories from Amazon vet Kristi Coulter, and plenty more. Please consider signing up here.
© 2024 Alex Kantrowitz