Is there a way to access engagement stats for a community to make a (daily) time series? Something along the lines of exporting data every 24 hours at 2:00 UTC.
This is the data I am referring to:
Perhaps an API of some sort. A formal method that goes beyond writing a scraper.
Community statistics are available from the Lemmy API. For example, this URL returns the info for this community: https://lemm.ee/api/v3/community?name=fedigrow
The community statistics are listed in the returned data under `counts`. The API only returns current numbers, though. To create a time series you need to write a script that grabs the community statistics on a regular basis and stores the results.
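To see that object directly, you can filter the response with jq. A quick sketch (the numbers here are made up for illustration, and the real `counts` object may contain additional fields):

```
$ curl -s "https://lemm.ee/api/v3/community?name=fedigrow" | jq '.community_view.counts'
{
  "community_id": 123,
  "subscribers": 4567,
  "posts": 890,
  "comments": 12345,
  "users_active_day": 12,
  "users_active_week": 56,
  "users_active_month": 150,
  "users_active_half_year": 400
}
```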
Thanks! This seems like the best area for further research.
Any chance that historical statistics like these could be incorporated into lemmyverse.net?
IDK if this counts as “writing a scraper,” but from `tmux` you could:

```
echo '"Daily Users","Weekly Users","Monthly Users","6-Month Users","Community ID","Subscribers","Posts","Comments"' > stats.csv
while true; do
  curl "https://YOUR_INSTANCE/api/v3/community?name=COMMUNITY_NAME" \
    | jq -r '[.community_view.counts | (.users_active_day, .users_active_week, .users_active_month, .users_active_half_year, .community_id, .subscribers, .posts, .comments)] | @csv' >> stats.csv
  sleep 24h
done
```
Edit: No auth needed, it’s simple
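If you specifically want it to fire at 2:00 UTC rather than relying on a long-running loop, a cron entry would also work. A minimal sketch, assuming the curl/jq line above (without the loop and sleep) is saved as a hypothetical executable script fetch_stats.sh that appends to stats.csv, and that the machine’s cron runs in UTC:

```
# min hour dom mon dow  command   (hypothetical path; adjust the hour if cron isn't on UTC)
0 2 * * * /home/you/fetch_stats.sh
```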
That request doesn’t need any auth.
Oop, you’re right, fixed.