Calvin: Actually, the total number of respondents to a poll does *not* need to approach the total number of BBS members, provided, of course, the goal is to measure the BBS membership. If that's the case, then based on how confident you want the result of the poll to be (e.g., 95% confidence) and how accurate (e.g., ±3%), you can actually mathematically work out the *exact* number of responses you need for a fairly good and reasonable answer.
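For what it's worth, here's a rough sketch of the calculation Calvin is describing, using the standard sample-size formula for a proportion with a finite population correction (the membership figure and margin are just made-up illustration values, not anything from this board):

```python
import math

def required_sample_size(N, z=1.96, margin=0.03, p=0.5):
    """Responses needed to estimate a proportion among N members.

    N      -- total population size (e.g., total BBS membership)
    z      -- z-score for the confidence level (1.96 ~ 95%)
    margin -- desired margin of error (0.03 means ±3%)
    p      -- assumed proportion; 0.5 is the conservative worst case
    """
    # Infinite-population sample size
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction: a small board needs fewer responses
    n = n0 / (1 + (n0 - 1) / N)
    return math.ceil(n)

# e.g., a hypothetical board of 5,000 members at 95% confidence, ±3%:
print(required_sample_size(5000))  # far fewer than 5,000
```

Which is exactly Calvin's point: the required n is a small fraction of the membership. Of course, this math only holds under random sampling, which is where the rest of this thread comes in.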
This would all be very true if the survey/poll were conducted with some semblance of a random sampling method and if the design/analysis accounted for the potential biases (especially non-response bias). For reasons that ninti elaborated on ("Not to mention the fact that it is a self-selected poll, and some groups may have more of a desire to answer than others, despite the anonymous nature. ...") this is hardly the case.
At least compared to the many self-selected Insta-Polls conducted by news organizations, this poll has a reasonably estimable N of possible respondents. Even then, the calculation of a traditional 90% or 95% (or whatever) confidence interval would not be appropriate. About the best you can do on a self-selected poll like this (with a known possible-respondent N) is to perform a sensitivity test -- take the n of non-responses and build 2 cases: Case 1, where all non-responses are assumed to be "Yes," and Case 2, where all non-responses are assumed to be "No." As you might guess, this gets a lot harder with an analysis of multiple-choice questions.
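The two-case sensitivity test above is simple enough to sketch for a yes/no question (the 60/40/500 numbers below are purely illustrative, not from any actual poll):

```python
def sensitivity_bounds(yes, no, population):
    """Worst-case bounds on the true 'Yes' share in a self-selected poll.

    yes, no    -- observed "Yes" and "No" responses
    population -- known N of possible respondents
    """
    nonresponders = population - yes - no
    # Case 1: every non-responder would have said "Yes" (upper bound)
    upper = (yes + nonresponders) / population
    # Case 2: every non-responder would have said "No" (lower bound)
    lower = yes / population
    return lower, upper

# e.g., 60 Yes and 40 No on a hypothetical 500-member board:
lo, hi = sensitivity_bounds(60, 40, 500)
print(f"true 'Yes' share lies somewhere between {lo:.0%} and {hi:.0%}")
```

Note how wide the interval gets when most of the membership doesn't respond -- which is exactly why no single-number "confidence interval" is honest here.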
And typically, yes, the number of poll respondents is the tiniest fraction of the membership of the sampling group. Strange but true.
Can you tell that Insta-Polls drive me nuts???
_________________________
Jim
'Tis the exceptional fellow who lies awake at night thinking of his successes.