Performance

Analytics turn practice into feedback you can trust. Instead of guessing whether you are improving, you see trends by topic and difficulty. That matters because human memory is a poor historian—we overweight recent pain and underweight slow gains unless something charts them.

Performance by mode

When you are signed in, each finished run is grouped by mode, such as mocks, Question bank sessions, and Custom practice, so you can compare like with like instead of mixing timed and untimed results.

Study smarter, not only longer

Prospective students often keep revisiting material they already know because it feels safe. Performance views nudge you toward high-leverage objectives, the ones that still wobble under time pressure. The point is not vanity metrics; it is to protect hours you cannot get back.

Combine this with a Learning plan so weekly goals stay realistic and measurable. If a metric improves but mocks do not, question whether you are practicing recognition without depth—then add Question bank variety or Custom practice on the stubborn cluster.
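One way to picture that "recognition without depth" check is a sketch like the following. The topic names, scores, and threshold are illustrative assumptions, not data the product exposes; the idea is simply to flag topics where practice accuracy runs well ahead of mock accuracy.

```python
# Hypothetical per-topic accuracy; all values are illustrative.
practice = {"anatomy": 0.92, "pharma": 0.88, "ethics": 0.70}
mock     = {"anatomy": 0.65, "pharma": 0.85, "ethics": 0.72}

GAP = 0.15  # assumed threshold: flag when practice outpaces mocks by this much

# Topics where a high practice score is not backed up under exam conditions.
stubborn = [
    topic for topic in practice
    if practice[topic] - mock.get(topic, 0.0) >= GAP
]
print(stubborn)  # → ['anatomy']
```

A cluster like this is where added Question bank variety or a targeted Custom practice run pays off most.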

Reading charts without spiralling

Check analytics after a block of sessions, not after every single question. Noise shrinks with sample size. Look for direction over two weeks, not perfection after two days.
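The sample-size point can be made concrete with a quick simulation. The session sizes and the "true" accuracy below are assumptions for illustration only; the takeaway is that single short sessions swing widely while a pooled two-week view settles near your real level.

```python
import random

random.seed(1)
TRUE_ACCURACY = 0.75  # assumed stable underlying skill

def session_score(n_questions):
    """Observed accuracy in one session of n_questions items."""
    correct = sum(random.random() < TRUE_ACCURACY for _ in range(n_questions))
    return correct / n_questions

# Ten tiny sessions: each one is a noisy estimate of the same skill.
small = [session_score(10) for _ in range(10)]
pooled = sum(small) / len(small)

print(min(small), max(small))  # individual 10-question runs scatter widely
print(pooled)                  # pooling 100 questions lands much closer to 0.75
```

This is why a dip after one bad evening is usually noise, while a two-week drift is signal.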

Signed-in history

Richer performance tracking generally requires an account so sessions persist across devices. If you are borrowing a device, sign out when finished. If numbers look flat despite effort, consider sleep, distraction, or splitting sessions—sometimes shorter runs produce cleaner data than one exhausted marathon.

Open in the test engine

Review trends in the Performance tab. Pair numbers with your own one-line notes after mocks for context the chart cannot see.