People volunteer more information than they need to when filling out forms online. That’s one of the findings from a study on voluntary over-disclosure in web forms. The other interesting finding:
- Mandatory form fields cause people to skip optional fields later in the form
Web forms look like surveys but the context is different—you’re trying to get something, not just answering questions. So I wondered: does this apply to surveys too? Do people actually want to answer as few questions as possible? And do mandatory questions make them skip optional ones?
I ran a simple experiment. I recruited 100 people to take a short two-page survey. For half, the first page was all mandatory questions; for the other half, everything was optional. The second page was optional for everyone.
I measured two things:
- How many of the ten optional questions were answered?
- What was the length of response for the final optional question, which was an open textbox question?
As in the web form study, people in my survey answered more questions than required. However, mandatory questions didn’t affect how many optional questions were answered later in the survey.
| Condition | Participants | Mean optional questions answered (of 10) | Mean characters in open response |
|---|---|---|---|
| Mandatory | 43 | 9.9 | 42.8 |
| Optional | 57 | 9.9 | 40.1 |
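If you want to reproduce the summary table from the dataset, a minimal sketch looks like the following. Note that the column names (`condition`, `n_optional_answered`, `open_response_chars`) and the inline toy rows are assumptions for illustration, not the dataset’s actual schema.

```python
import statistics

# Toy stand-in rows in roughly the shape the real data might take.
# (Hypothetical values; substitute rows loaded from the actual dataset.)
rows = [
    {"condition": "mandatory", "n_optional_answered": 10, "open_response_chars": 45},
    {"condition": "mandatory", "n_optional_answered": 9,  "open_response_chars": 40},
    {"condition": "optional",  "n_optional_answered": 10, "open_response_chars": 38},
    {"condition": "optional",  "n_optional_answered": 10, "open_response_chars": 42},
]

def summarize(rows, condition):
    """Per-condition count and means, matching the table's columns."""
    subset = [r for r in rows if r["condition"] == condition]
    return {
        "n": len(subset),
        "mean_answered": statistics.mean(r["n_optional_answered"] for r in subset),
        "mean_chars": statistics.mean(r["open_response_chars"] for r in subset),
    }

for condition in ("mandatory", "optional"):
    print(condition, summarize(rows, condition))
```

With the real data loaded (e.g. via `csv.DictReader`), the two printed summaries should reproduce the rows of the table above.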
People answer survey questions they don’t have to—contrary to the common belief that everyone wants to escape surveys as fast as possible. My experiment was small and different enough from the web form study that I can’t say the original findings are wrong. But the contradictory results suggest you shouldn’t assume ideas from one context transfer cleanly to another.
The dataset is available here if you want to poke around or run your own analysis.