Abstract: The proliferation of powerful large language "foundation models" has made AI more visible in daily life and enabled an explosion of new tools and capabilities that is still ongoing. In the domain sciences, these models hold equal potential to transform our understanding and predictive capabilities. In this talk, I will demystify what foundation models are, how and why we are now able to train them, and the transfer learning process that enables widespread downstream tools and applications. Astrobiology poses the added challenge of building models that generalize to highly novel environments and contexts, so I will give special attention to model training and evaluation approaches for astrobiology data and applications.