I did a GDPR request for all my data, and then aimed the reddit-user-to-sqlite script at it – https://github.com/xavdid/reddit-user-to-sqlite
The metadata.json and the recommendation to use datasette to interact with it make finding old comments super easy. I've been going through all my comments sorted by date, clicking each permalink manually and editing-then-deleting them while bored during the workday (meetings, etc.). It has the added benefit of making it incredibly difficult to figure out if I'm a bot or not.
I'm thinking of instead pasting each comment into chatgpt and leaving the edited response in its place.
Maybe not “need”, but yes, a fully peer reviewed study confirming or rejecting seemingly obvious conclusions is an important part of the scientific method. It’s how we gain confidence in what we (think we) know.