Breakthrough Technique: Meta-learning for Compositionality

Original:
https://www.nature.com/articles/s41586-023-06668-3

Popular-science summary:
https://scitechdaily.com/the-future-of-machine-learning-a-new-breakthrough-technique/

How MLC Works
In exploring the possibility of bolstering compositional learning in neural networks, the researchers created MLC, a novel learning procedure in which a neural network is continuously updated to improve its skills over a series of episodes. In an episode, MLC receives a new word and is asked to use it compositionally—for instance, to take the word “jump” and then create new word combinations, such as “jump twice” or “jump around right twice.” MLC then receives a new episode that features a different word, and so on, each time improving the network’s compositional skills.
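As a rough illustration of this episodic setup, here is a small Python sketch (a hypothetical simplification, not the authors' code: the word lists, symbols, and function names are invented). It builds episodes in which a primitive word is mapped to a fresh output symbol, demonstrated in a few study examples, and then queried in combination with known modifiers such as "twice"; in the actual MLC procedure, a sequence-to-sequence network would be updated on every episode.

```python
import random

# Hypothetical sketch of the episode structure behind meta-learning for
# compositionality: each episode re-randomizes the meaning of a primitive
# word, so the learner must infer it from the study examples and then
# compose it with modifiers it already knows.

PRIMITIVE_WORDS = ["jump", "dax", "wug", "blicket"]   # surface forms
OUTPUT_SYMBOLS = ["JUMP", "RED", "GREEN", "BLUE"]     # arbitrary action symbols

# Compositional rules shared across all episodes.
MODIFIERS = {
    "twice":  lambda seq: seq * 2,
    "thrice": lambda seq: seq * 3,
}

def make_episode(rng: random.Random):
    """Sample one episode: a fresh word-to-symbol mapping plus compositional queries."""
    word = rng.choice(PRIMITIVE_WORDS)
    symbol = rng.choice(OUTPUT_SYMBOLS)

    # Study examples show the new primitive in isolation.
    study = [(word, [symbol])]

    # Queries require combining the new word with known modifiers,
    # e.g. "jump twice" -> [JUMP, JUMP].
    queries = [(f"{word} {mod}", fn([symbol])) for mod, fn in MODIFIERS.items()]
    return study, queries

def train(num_episodes: int = 3, seed: int = 0):
    """Skeleton of the episodic loop; a real network update would go where noted."""
    rng = random.Random(seed)
    for ep in range(num_episodes):
        study, queries = make_episode(rng)
        # model.update(study, queries)  # placeholder for the neural-network update
        print(f"episode {ep}: study = {study}")
        for query, target in queries:
            print(f"  query {query!r} -> target {target}")

if __name__ == "__main__":
    train()
```

Because the word-to-meaning pairing keeps changing from episode to episode, the network is pushed toward the general compositional skill rather than toward memorizing any single vocabulary.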

  • A_A@lemmy.world (OP) · 1 year ago

    Good to know, thanks.
    Yours is the type of comment I was really hoping to read here.

    You are right: it’s the same authors (Brenden M. Lake & Marco Baroni) with mostly the same content.

    But they also write (in Nature) that modern systems such as GPT-4 do not yet incorporate these abilities:

    Preliminary experiments reported in Supplementary Information 3 suggest that systematicity is still a challenge, or at the very least an open question, even for recent large language models such as GPT-4.

    • DigitalMus · 1 year ago

      This could certainly be part of the motivation for publishing it this way: making themselves more visible to the big players. By the way, publishing open access in Nature is expensive, roughly €6,000–8,000 for the big journals, so there is definitely a reason.