Honeypot Trap
A honeypot trap is a hidden element placed on a web page specifically to catch automated clients. Common forms include invisible links (`display: none` or positioned off-screen), zero-pixel form fields, and URLs in robots.txt-disallowed directories that no legitimate user would ever visit. A real human never interacts with these elements; a naive scraper that follows every link or fills every form field walks straight into them. Once a client interacts with a honeypot, the server flags the IP, fingerprint, or session as a bot and blocks subsequent requests.

Honeypots are cheap to deploy and remarkably effective against unsophisticated scrapers. They are part of nearly every modern anti-bot stack, layered alongside fingerprinting, rate limiting, and behavioral analysis.

The defense is simple but requires care: respect CSS visibility rules, do not click invisible links, do not fill non-displayed form fields, and validate that anchor targets are reachable through the rendered page. For AI builders, honeypot avoidance is one reason to use a real headless browser rather than a plain HTTP fetch — a headless browser sees the rendered page and naturally ignores hidden elements. Stealth-mode browser images and managed scraping platforms also handle honeypot avoidance by default. Building your own scraper means writing the logic that distinguishes visible, interactive elements from hidden traps.
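As an illustration of that filtering logic, here is a minimal sketch of a link collector that skips elements hidden via common inline-CSS tricks. It uses only the Python standard library; the `HIDDEN_STYLE` patterns and the `LinkCollector` class are illustrative inventions, not any particular library's API. Note the limitation: inline-style matching cannot see stylesheets or computed styles, which is exactly why a rendered DOM in a headless browser is the more robust approach.

```python
import re
from html.parser import HTMLParser

# Heuristic inline-style signals that an element is hidden from real users.
HIDDEN_STYLE = re.compile(
    r"display\s*:\s*none"
    r"|visibility\s*:\s*hidden"
    r"|opacity\s*:\s*0(?:\.0+)?\s*(?:;|$)"
    r"|left\s*:\s*-\d{3,}"                      # pushed far off-screen
    r"|(?:width|height)\s*:\s*0(?:px)?\s*(?:;|$)",  # zero-pixel element
    re.IGNORECASE,
)

def looks_hidden(attrs: dict) -> bool:
    """Return True if the element's attributes suggest a honeypot."""
    if HIDDEN_STYLE.search(attrs.get("style", "")):
        return True
    if "hidden" in attrs or attrs.get("aria-hidden") == "true":
        return True
    return False

class LinkCollector(HTMLParser):
    """Collect <a href> targets, sorting hidden links into a honeypot bucket."""
    def __init__(self):
        super().__init__()
        self.visible_links = []
        self.honeypots = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            bucket = self.honeypots if looks_hidden(attrs) else self.visible_links
            bucket.append(attrs["href"])

html = """
<a href="/products">Products</a>
<a href="/trap" style="display:none">secret</a>
<a href="/trap2" style="position:absolute; left:-9999px">offscreen</a>
"""
parser = LinkCollector()
parser.feed(html)
print(parser.visible_links)  # safe to crawl
print(parser.honeypots)      # never follow these
```

A scraper would crawl only `visible_links` and treat any URL in `honeypots` as radioactive. Because this operates on raw HTML, it misses traps hidden by external CSS or JavaScript; a headless browser checking computed visibility (e.g., an element's rendered bounding box) closes that gap.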