AI-generated text is a problem in academia, but it isn't new. Back around 2005, some MIT students got an algorithmically generated paper (SCIgen) through peer review and published at a conference.
Fake microscopy data, on the other hand, is a major threat: it's a fabricated measurement of the world. Whoever solves the problem of ensuring image authenticity will be very rich.
That does seem concerning, and it's something I'd never thought about. Are there any examples of people faking data like that yet?