Inspired by the bots on Reddit that respond to certain words, I’ve thrown together this code, which allows anyone to set up their own response bot.
There is a bit more detail on Github, but in summary: you can set your own trigger word and responses, and there are two modes of operation. “Exclude” is the default and covers every community you’re federated with (moderators of a community can PM the bot to exclude it from theirs), while “Include” lets you pick a single community for the bot to be active in.
This is early days and still rough, but it should work at the most basic level. If anyone can offer ideas/feedback/improvements, I’m totally open to them.
And to prove it works, I’m running Legolas Bot. Any comment you make below with the word “legolas” in it will get a response (probably).
Small update to reduce spamminess: it will only reply to top-level comments now.
Edit: Other small updates include customisable polling rates and the ability to tag the comment creator’s name in a response.
Looking at the code, it seems to download the 25 newest comments every 5 seconds from the instance it is configured to go through. I understand that this is probably the easiest way to go, but it also sounds like a lot of wasted traffic, downloading the same messages over and over. Isn’t there a way to make it more efficient and fetch only the unread messages? Does Lemmy not support that? I guess one would need to fiddle with an instance’s code to do that efficiently.
It’s in JSON format, so in reality it’s very little data. There’s no way (that I know of) to grab only “new” comments - I don’t think the Lemmy API has anything like that.
Even if you store seen comments in a database, you’ve still got to pull the comments to check whether they’ve been seen or not, which defeats the object.
25 every 5 seconds might be a touch overkill too, but it does stop the bot from missing any comments. I can certainly move those values to variables that can be set in the env file/Docker.
Edit to add: if the bot is locked down to one community then yes, it’s way overkill, so I will add them as variables and update the docs to reflect that.
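For anyone following along, here is a minimal sketch of this style of polling loop. It is not the bot’s actual code: the instance URL, polling rate, and trigger word are placeholders, and it assumes the public GET /api/v3/comment/list endpoint with the response field names from the Lemmy v3 API.

```python
import time
import requests

INSTANCE = "https://lemmy.zip"   # placeholder instance URL
POLL_SECONDS = 5                 # placeholder polling rate (configurable, per the post above)
TRIGGER = "legolas"              # placeholder trigger word

seen_ids = set()                 # comment IDs already handled (a real bot would persist/prune this)

while True:
    resp = requests.get(
        f"{INSTANCE}/api/v3/comment/list",
        params={"limit": 25, "sort": "New", "type_": "All"},
        timeout=30,
    )
    resp.raise_for_status()
    for view in resp.json()["comments"]:
        comment = view["comment"]
        if comment["id"] in seen_ids:
            continue             # already seen on an earlier poll, skip it
        seen_ids.add(comment["id"])
        if TRIGGER in comment["content"].lower():
            print(f"Would reply to comment {comment['id']}")  # reply logic would go here
    time.sleep(POLL_SECONDS)
```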
I see, thanks. I guess if it becomes problematic at some point, instances can add hooks or lightweight calls.
I’ve pushed the change, so operators can change those values in the env file or via Docker. By the way, let me know if you do start work on the megathread thing; it poses an interesting challenge in terms of structuring posts and handling that data.
Huh, that’s surprising. The desktop web interface and some apps (Connect) already show the number of new comments on a post.
Seems strange that there is no way to identify which comments these are via the API. Might be a good feature request to propose to the Lemmy devs.
I could entirely be wrong, but I don’t see anything obvious in the API that indicates this is a function of the API. You could potentially use markPostAsRead after scanning each comment, but I don’t see a way of pulling only new unread comments after that. Would love to be proven wrong though :)
You can call https://lemmy.ml/api/v3/comment/list?limit=20&sort=New&type_=All. In general, I suggest trying things on the website and then checking in the browser console which API endpoints it calls.

Thanks, but this doesn’t pull only unread comments. If I pull the latest 5 comments and then mark the overarching posts as read, I get this:
2024-02-02 09:52:11,278 - INFO - Requesting API Request.GET /comment/list
2024-02-02 09:52:11,507 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9335073 Comment ID = 6915381
2024-02-02 09:52:11,629 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9007864 Comment ID = 6915380
2024-02-02 09:52:11,742 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9319139 Comment ID = 6915382
2024-02-02 09:52:11,916 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9334778 Comment ID = 6915379
2024-02-02 09:52:12,100 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9283396 Comment ID = 6915378
If I then pull the 5 latest comments again:
2024-02-02 09:52:12,238 - INFO - Requesting API Request.GET /comment/list
2024-02-02 09:52:12,380 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9335073 Comment ID = 6915381
2024-02-02 09:52:12,521 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9007864 Comment ID = 6915380
2024-02-02 09:52:12,673 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9319139 Comment ID = 6915382
2024-02-02 09:52:12,835 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9334778 Comment ID = 6915379
2024-02-02 09:52:12,977 - INFO - Requesting API Request.POST /post/mark_as_read Post ID = 9283396 Comment ID = 6915378
That is the same 5 comments. So what I’m looking for is a way to pull only previously “unseen” comments; if there were only 1 or 2 new comments, that would reduce the amount of data returned from the API each time I check the list, rather than returning all 25.
Apps can indicate that there are new unread comments on a post, but I assume they’re not doing this via the API and it’s a UI thing to do with caching?
I may not have explained myself clearly here, though!
On GET /api/v3/post/list there is a field posts[0].unread_comments which the UI uses, probably based on the mark-as-read endpoint. But that doesn’t give you the comments themselves. So I think it’s better to call /api/v3/comment/list once a minute or so; the amount of data returned is nothing to worry about. Still, if you want to minimize it, call with limit=1 and compare that comment to see how many you missed in between, then make additional requests for those comments you don’t have yet.

Nice solution, thank you :)
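For reference, a rough sketch of that limit=1 idea. It assumes comment IDs generally increase over time (federation can make this fuzzy) and backfills any gap with a single larger page fetch rather than per-comment requests; endpoint and field names are again the Lemmy v3 comment list as described above.

```python
import time
import requests

INSTANCE = "https://lemmy.ml"   # placeholder instance URL
last_seen_id = 0                # highest comment ID already processed

while True:
    # Cheap check: fetch only the single newest comment.
    newest = requests.get(
        f"{INSTANCE}/api/v3/comment/list",
        params={"limit": 1, "sort": "New", "type_": "All"},
        timeout=30,
    ).json()["comments"]
    if newest:
        newest_id = newest[0]["comment"]["id"]
        if newest_id > last_seen_id:
            # Something new arrived: pull a page big enough to cover the gap.
            gap = min(newest_id - last_seen_id, 50)
            page = requests.get(
                f"{INSTANCE}/api/v3/comment/list",
                params={"limit": gap, "sort": "New", "type_": "All"},
                timeout=30,
            ).json()["comments"]
            for view in page:
                if view["comment"]["id"] > last_seen_id:
                    print("New comment:", view["comment"]["id"])  # handle it here
            last_seen_id = newest_id
    time.sleep(60)  # "once a minute", per the suggestion above
```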
That’s exactly what my package, which adds webhooks to Lemmy, is for. It allows you to listen for events (like a new comment) and react to them. It even supports filtering, so you can filter for comments containing “legolas” at the webhook level, and you only receive the event if it’s there. More info here.
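For illustration only, here is what a receiver for such webhook events might look like. The endpoint path and payload field names below are made up for this example, not the package’s actual schema, so check its docs for the real event shape.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/hooks/comment", methods=["POST"])    # made-up path for this example
def on_comment():
    event = request.get_json(force=True)
    content = event.get("content", "")            # assumed field name, not the package's real schema
    if "legolas" in content.lower():
        print("Trigger word seen - a real bot would post its reply here")
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```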
Tagging @Demigodrick@lemmy.zip as well.
Thank you, good sir/madam! It is thanks to this kind of effort that we are slowly making the fediverse a better place!