• Underestimating AI's long-term impact with Google DeepMind's Alexandra Belias

  • Oct 4 2023
  • Length: 46 mins
  • Podcast

  • Summary

  • Underestimating AI's long-term impact - the 21st century's industrial revolution

    In this episode, we delve into the transformative world of AI, discussing how it's become the Industrial Revolution of the 21st century.

    We cover nowcasting — high-resolution rain forecasting up to two hours ahead, using a generative model that forecasters ranked first for accuracy and usefulness; AlphaFold, which accurately predicts 3D models of protein structures and is accelerating research in nearly every field of biology; and AlphaMissense, which analyses the effects of DNA mutations and will accelerate research into rare diseases.

    UN General Assembly & AI's Global Prominence:
    We kick things off at the United Nations General Assembly, where a new $5 billion commitment signifies AI's top-of-mind status this year, reminiscent of the previous year's focus on climate change. We explore how AI can support the United Nations' Sustainable Development Goals (SDGs) and its current role in "nowcasting."

    Risks and Governance in the AI Era:
    We dissect the risks of AI at both national and global levels and the need for a high-level advisory body on AI. The 2024 Summit of the Future and the delicate balance of creating AI guardrails without stifling innovation come under scrutiny.

    EU Regulation and the Challenge of Risk Assessment:
    We examine EU regulations on high-risk AI systems and the complexities of scaling AI's risk in the face of rapid technological development. We ponder the necessity of taking risks for fundamental research in the era of "Grey-zone AI development."

    AlphaFold's Impact and the Global AI Race:
    AlphaFold's groundbreaking work in protein folding prediction is spotlighted. We discuss the ongoing global AI race, emphasizing the need for regulation, commercialization, brain drain, and education. We ask if the EU should replicate the US model with research centers or "lighthouses."

    US and UK Perspectives on AI Governance:
    We explore the chances of AI bills passing in the US and hear UK Deputy Prime Minister Oliver Dowden's perspective on the unique nature of AI governance. We dive into AI's potential for advancing science, particularly in the medical field.

    Amara's Law and the Long-Term Impact of AI:
    We introduce Amara's Law and its relevance to our current AI hype. Are we underestimating what AI will achieve in the coming decade? We reflect on the most exciting aspects of working for DeepMind and the prevailing philosophy in AI today.

    AI Overhype and Focusing on Key Concerns:
    We address the overhype surrounding AI, drawing parallels with the start of the Industrial Revolution when people couldn't foresee the transformation ahead. We emphasize the challenge of maintaining focus amid the noise.

    Fairness, Equity, and the Global Digital Compact:
    Diversity and equity in AI development are discussed in-depth. How can AI be equitable when its developers largely come from a specific demographic? We introduce the concept of the Global Digital Compact.

    Favourite AI Tools and a Glimpse into the Future

    Supported by: International Visegrad Fund

    If you want better insights into the challenges and decisions you or your business are facing, GARI's analytical services offer in-depth, highly accurate analysis — whether your questions concern the green energy transition, trade and supply chains, or political and security matters. Contact us for a free consultation and see how you can optimise your decision-making.
    www.globari.org
    LinkedIn
    @GARInstitute / Twitter

