I could certainly improve the workflow and automate a lot more if I wanted a live dashboard or to do this regularly, but without any API, here's what I did to get the job done for now:
  1. visited the stacker.news front page, as mentioned above under 'notes'
  2. tapped the 'more' button to load more than 21 posts at a time, repeating ~390 times
  3. right-clicked the page and saved the .html file locally
  4. wrote a Python script that walks the saved HTML and extracts all the front-end post content (title, author, territory, sats, cowboy hat, comments, etc.) into a CSV (see the sketch after this list)
  5. pasted that CSV into a spreadsheet and added new columns with formulas to sort and categorise the content
  6. created a bunch of pivot tables & graphs in said spreadsheet
  7. screenshotted those graphs & uploaded them to SN
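For step 4, here's a minimal sketch of what that script can look like using BeautifulSoup. I'm not reproducing stacker.news's actual markup here, so the input file name and every CSS selector below (`div.item`, `a.title`, etc.) are placeholders you'd swap for the real classes:

```python
import csv
from bs4 import BeautifulSoup

def text(node):
    # Missing elements become empty strings instead of crashing the run
    return node.get_text(strip=True) if node else ""

with open("stacker_news_front_page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

rows = []
for item in soup.select("div.item"):  # hypothetical per-post container
    rows.append({
        "title": text(item.select_one("a.title")),
        "author": text(item.select_one("a.author")),
        "territory": text(item.select_one("a.territory")),
        "sats": text(item.select_one("span.sats")),
        "comments": text(item.select_one("a.comments")),
    })

with open("posts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "author", "territory", "sats", "comments"])
    writer.writeheader()
    writer.writerows(rows)
```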
If I were to make this an ongoing process, I'd automate steps 2 and 5 and make sure the data got pushed to Tableau or something as capable. For a one-off exercise like this, however, it's more than sufficient.
And next time I want to click a button 390+ times to get an updated view, I can simply paste the new data over the existing spreadsheet data and all the pivot tables will update automatically.
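If I ever do automate step 2, something like Playwright could do the clicking and save the HTML in one go. A rough sketch, not what I actually ran; the 'more' button text and the 500 ms wait are assumptions to adjust against the real page:

```python
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://stacker.news/")
    for _ in range(390):  # each click loads another batch of ~21 posts
        page.get_by_text("more", exact=True).click()
        page.wait_for_timeout(500)  # crude pause for the next batch to render
    # Save the fully expanded page for the parsing script above
    with open("stacker_news_front_page.html", "w", encoding="utf-8") as f:
        f.write(page.content())
    browser.close()
```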
Try this to automate step 2. For $12/yr, the PRO version could be really useful and save you a lot of time (and clicks!)