1. 2
    1. 6

      Slightly misleading title IMO – this particular project doesn’t represent a new step taken by LLMs or anything like that; it’s more a novel combination of Telegram, the Chat API, and the Transcription API. There’s a notable difference between “understands text and voice” and “transcribes audio to text in addition to supporting normal text input.”

      Otherwise, I’m not sure there’s too much to discuss here. This is a pretty straightforward usage of OpenAI’s API for a Telegram bot.

      1. 1

        Seeing that this is your project, OP, I hope I didn’t come off as too dismissive. I’m sure it was interesting to work on; Telegram’s support for bots is a lot of fun.

        Were there any hurdles you ran into while working on this, or do you have any stories about interesting problems you solved?

        1. 1

          There’s no secret to it – I simply integrated the GPT chat into Telegram and added a voice transcription feature. It’s my hope that this functionality will be useful to someone.
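          The integration described above is small enough to sketch. The following is a minimal, illustrative outline of that flow; the function names and update shape are my own assumptions (network calls to the Chat and Transcription APIs are passed in as stand-ins), not code from the project:

```python
# Hedged sketch of "GPT chat + voice transcription in Telegram".
# All names here are illustrative assumptions, not the project's code.

def build_chat_payload(history, user_text, model="gpt-3.5-turbo"):
    """Append the user's (typed or transcribed) text to the running
    conversation and build a Chat Completions-style request body."""
    messages = list(history) + [{"role": "user", "content": user_text}]
    return {"model": model, "messages": messages}

def needs_transcription(update):
    """A Telegram update carries either 'text' or 'voice'. Voice notes
    must go through the Transcription API first; after that they are
    handled exactly like typed text."""
    return "voice" in update.get("message", {})

def handle_update(update, history, transcribe, chat):
    """transcribe(file_id) -> str and chat(payload) -> str are stand-ins
    for the Transcription and Chat API calls (network code omitted)."""
    msg = update["message"]
    if needs_transcription(update):
        text = transcribe(msg["voice"]["file_id"])
    else:
        text = msg["text"]
    return chat(build_chat_payload(history, text))
```

          Which is the parent comment’s point in code form: “understands voice” reduces to one extra API call in front of the normal text path.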

        2. 4

          not at all. the llms are not capable of understanding anything.

          1. 2

            I still struggle to call it understanding (its usage here being some perverse algorithmisation of cognition), but LLMs do seem to have some kind of “world model” they operate on, suggesting some kind of meaningful computation (vile usage, but again, in some waffly “composed of the meanings of words and their relationships” sense): https://thegradient.pub/othello/

            1. 1

              What is your justification for stating that the LLM cannot understand?

              Is it based on the way that they are implemented internally or is it based on assessing the outputs it gives to a certain set of inputs?

            2. 1

              Lobsters isn’t just a place for advertising your own stuff, at least half of your contributions should be submissions of other people’s content or useful comments on other people’s posts.
