Frequently Asked Questions
Is a human watching the candidate's camera feed?
No! The process is completely automated. Our AI analyzes the camera feed as the candidate attempts the test and detects any violations. There is no human anywhere in the process.
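For the technically curious, the automated loop looks roughly like this: sample frames from the candidate's camera and pass each one to a model. This is a minimal TypeScript sketch, not AutoProctor's actual code; `detectViolations` is a hypothetical stand-in for the real model, and the sampling interval is an assumption.

```typescript
// Sketch of fully automated proctoring: camera frames are sampled and
// scored by a model in a loop, with no human in the pipeline.
// `detectViolations` is a hypothetical placeholder for the real model.

type Violation = { kind: string; confidence: number };

declare function detectViolations(frame: ImageData): Promise<Violation[]>;

async function proctorLoop(video: HTMLVideoElement, intervalMs = 2000) {
  // Assumes the camera <video> element is already playing.
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext('2d')!;

  setInterval(async () => {
    // Grab the current camera frame.
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);

    // The model, not a human, decides whether anything looks suspicious.
    const violations = await detectViolations(frame);
    violations.forEach(v => console.log(`violation: ${v.kind}`));
  }, intervalMs);
}
```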
How do you protect candidate privacy?
Do read our Privacy Policy for the full picture, but in short: firstly, unlike most other proctoring tools, we don't store the full exam session. We only store evidence of violations (and a few random photos, if that setting is enabled). So if a candidate doesn't do anything suspicious, no data is stored.
Secondly, we are compliant with a whole host of student privacy laws in North America, and with the GDPR in Europe.
Lastly, we do not sell any of this data. As outlined in the Privacy Policy, the data is automatically erased within a few months of being generated.
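To make the storage model concrete, here is a hedged TypeScript sketch of "store only evidence of violations, with automatic expiry". The function names (`uploadEvidence`), the `Evidence` shape, and the 90-day retention window are illustrative assumptions, not AutoProctor's actual API or retention period.

```typescript
// Sketch: nothing is stored unless a violation was detected, and what is
// stored carries an expiry date so it is erased automatically.
// All names and the 90-day window are illustrative assumptions.

const RETENTION_DAYS = 90; // "a few months"; exact value is hypothetical

interface Evidence { candidateId: string; kind: string; blob: Blob; }

declare function uploadEvidence(e: Evidence, expiresAt: Date): Promise<void>;

async function onFrameAnalyzed(evidence: Evidence | null) {
  // Nothing suspicious detected: nothing is uploaded, nothing is stored.
  if (evidence === null) return;

  // A violation was detected: store just this piece of evidence, tagged
  // with the date after which it must be deleted.
  const expiresAt = new Date(Date.now() + RETENTION_DAYS * 24 * 60 * 60 * 1000);
  await uploadEvidence(evidence, expiresAt);
}
```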
What if a candidate denies camera or mic access?
If camera or mic access hasn't been granted, or has been revoked, we don't let the candidate continue with the test. So, they will have to grant access to see the questions. If they block the camera with their hand or otherwise obscure it, you will see that as a violation and the Trust Score will be lower.
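In the browser, this gating can be done with the standard getUserMedia API, which rejects if the candidate denies (or has revoked) access. A minimal sketch under that assumption; `startTest`, `pauseTest`, and the alert message are illustrative, not AutoProctor's actual code.

```typescript
// Sketch: block the test until camera and mic access are both granted.
// getUserMedia is a standard browser API; everything else is illustrative.

declare function pauseTest(): void;

async function requireCameraAndMic(): Promise<MediaStream | null> {
  try {
    // The browser prompts the candidate; this rejects if access is
    // denied, previously revoked, or the devices are unavailable.
    return await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  } catch {
    return null;
  }
}

async function startTest(showQuestions: () => void) {
  const stream = await requireCameraAndMic();
  if (!stream) {
    // No access: the candidate never sees the questions.
    alert('Please grant camera and mic access to start the test.');
    return;
  }
  // If access is revoked mid-test, the tracks end and the test pauses.
  stream.getTracks().forEach(t => (t.onended = () => pauseTest()));
  showQuestions();
}
```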
Why do I see No Face Detected even though there is a face in the photo?
Like most AI applications, AutoProctor can make mistakes. Even though you may see a face in the photo, the AI may categorize it as No Face Detected. In almost all cases, this is because the candidate (i) isn't in a well-lit environment, (ii) isn't looking directly at the camera, (iii) doesn't have a plain background, or (iv) isn't fully within the camera frame. By changing these things about the test-taking environment, the candidate can help the AI recognise their face.
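Since poor lighting is the most common cause, one practical mitigation is a pre-test brightness check that warns the candidate before the face detector ever runs. A sketch using an average-luma computation; the 60/255 threshold and the hint message are assumptions, and the actual face detector is not shown.

```typescript
// Sketch: a pre-test lighting check. If the average luma of the frame is
// low, a face detector is likely to fail, so warn the candidate first.
// The 60/255 threshold is an illustrative assumption.

function averageLuma(frame: ImageData): number {
  const d = frame.data;
  let sum = 0;
  for (let i = 0; i < d.length; i += 4) {
    // Standard Rec. 601 luma approximation from the RGBA pixel data.
    sum += 0.299 * d[i] + 0.587 * d[i + 1] + 0.114 * d[i + 2];
  }
  return sum / (d.length / 4);
}

function lightingHint(frame: ImageData): string | null {
  return averageLuma(frame) < 60
    ? 'Your environment looks too dark; please face a light source.'
    : null;
}
```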
How accurate is noise detection?
Noise detection is fairly accurate, but you may see a few false positives because AutoProctor cannot distinguish between human and environmental noise. So, you'll have to play the audio evidence and decide for yourself whether the candidate was cheating.
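This limitation is easier to see with a sketch: noise flagging of this kind typically reduces to a loudness threshold on the mic signal, which a loud fan can trip as easily as a voice. A minimal Web Audio API example; the threshold, polling interval, and `onNoise` callback are assumptions, not AutoProctor's implementation.

```typescript
// Sketch: flag "noise" whenever the mic's RMS level crosses a threshold.
// An energy check like this cannot tell speech from a door slam, which is
// why flagged audio still needs a human listen. Threshold is assumed.

function watchNoise(stream: MediaStream, onNoise: () => void, threshold = 0.1) {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  ctx.createMediaStreamSource(stream).connect(analyser);

  const buf = new Float32Array(analyser.fftSize);
  setInterval(() => {
    analyser.getFloatTimeDomainData(buf);
    // Root-mean-square amplitude of the current audio window.
    const rms = Math.sqrt(buf.reduce((s, x) => s + x * x, 0) / buf.length);
    if (rms > threshold) onNoise(); // loud, but human or environmental?
  }, 500);
}
```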