The Future of Language Models and Transformers


Transformers have now been scaled to vast amounts of static data. This approach has been so successful that it has forced the research community to ask, “What’s next?” This workshop will bring together researchers thinking about questions related to the future of language models beyond the current standard model. The workshop is meant to be exploratory and welcoming of novel directions in which new setups may arise, e.g. data efficiency, training paradigms, and architectures.

This workshop is part of the programming for the thematic semester on Large Language Models and Transformers, organized in collaboration with the Simons Institute for the Theory of Computing.

Travel grants are available to attend the event in California.

The workshop will also be streamed live online (registration required).

Organizer

Sasha Rush (Cornell University; chair)