• NeatNit@discuss.tchncs.de
    4 days ago

    To add to that, credit where credit is due, LLMs can often pick up on things like this. Machine translation has been based on LLMs (or their more primitive neural ancestors) for many years, even before the AI boom. So AI probably helped a bit here.

    That’s my wild guess. I wouldn’t call it a hypothesis, I’m just talking out of my ass.

    • Elting@piefed.social
      4 days ago

      Translation might be the only thing they genuinely do better than older tools.

      • lugal@sopuli.xyz
        4 days ago

        There are other uses in computational linguistics. My master's thesis was a neural parser. Other uses include pattern recognition in medicine, for example. But your point stands that it often makes things worse.

        • Elting@piefed.social
          4 days ago

          I had heard about the medicine thing, actually. When the use case actually lines up with what the tool is, it makes sense. It's that old adage, though: "When you have a hammer, everything looks like a nail."

        • BaroqueInMind@piefed.social
          4 days ago

          Is there any way I can read your thesis? I'm casually curious, and also have no idea whether theses are allowed to be shared online with rando people like me.

          • lugal@sopuli.xyz
            4 days ago

            It depends in part on your ability to read German 😅 I wrote another comment elaborating a little and giving clues for "further reading".

          • lugal@sopuli.xyz
            4 days ago

            Well, it parses natural language. In linguistics, or syntax to be precise, there are different ideas on how to build syntax trees. The most common is Dependency Grammar: basically a tree where every word points to the word it depends on (the adjective to the noun, the subject and the object to the verb, the verb being the root). I applied this to a different syntax theory called Role and Reference Grammar. You can google the latter; if you want to look into neural parsers in general, Stanford NLP has modules for Python and, I think, online tools as well.
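
            For anyone curious what a dependency tree actually looks like, here's a toy sketch (my own illustration, not from the thesis): each word stores the index of its head, with a virtual ROOT that the main verb attaches to.

            ```python
            # Toy dependency tree for "The big dog chased the cat".
            # Index 0 is a virtual ROOT node; heads[i] gives the index
            # of the word that word i depends on.
            words = ["ROOT", "The", "big", "dog", "chased", "the", "cat"]
            heads = [None,   3,     3,     4,     0,        6,     4]
            #        root   det   adj    subj   verb      det   obj

            def dependents(head_index):
                """Return all words whose head is the word at head_index."""
                return [words[i] for i, h in enumerate(heads) if h == head_index]

            print(dependents(4))  # words attached to "chased": ['dog', 'cat']
            print(dependents(3))  # words attached to "dog": ['The', 'big']
            ```

            Real parsers (e.g. the Stanford tools mentioned above) produce the same kind of head-pointer structure, just predicted by a neural network with labeled relations on each arc.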

    • grissino@lemmy.world
      4 days ago

      A hypothesis is basically a guess based on logical assumptions, so you're already there.