There's not just work for beta testers; there's risk. If your car gets into an accident while FSD is active because FSD does something wrong, you're responsible. Even if nobody is injured or killed, the points go on your driver's license, your insurance rates go up, and so on. FSD beta testers should never forget this.
I haven't seen any scientific studies of just how risky the FSD beta is, but my subjective impression is that it's very poor compared to even an average human driver. My own nickname for FSD is "Flaky Student Driver," and I think that sums it up better than any other three words I've heard. In my experience, it's more stressful and difficult to drive with FSD beta active than to drive manually, because I have to do all the mental tasks of driving normally (watching traffic and pedestrians, planning what to do if somebody does something stupid, etc.), plus try to anticipate what my own car will do and plan for its stupid moves.
That said, FSD is technically impressive, despite its flaws. It does things that were science fiction not so long ago. (Remember the TV show "Knight Rider"?) Being technically impressive and being useful for everyday driving are two different things, though.
Despite being pretty "down" on the state of FSD as a useful feature, I do think that it needs to be tested. I believe that self-driving technology (whether Tesla's or somebody else's, and most likely multiple development paths) has the potential to save a lot of lives in the long run, and I don't know of a way to get there without testing the technology on real roads. What I have a problem with is Tesla's (and especially Musk's) marketing and sales practices, which make FSD seem more capable and less risky than it is, resulting in faster and more widespread deployment than is advisable.