Chrome-Lighthouse

Neutral · Block: Depends · Verified: Yes · Respects robots.txt: No

Chrome-Lighthouse is an automated, open-source tool for auditing web page quality and does not operate as a traditional web crawler. It runs a series of audits against a given page to generate a report on performance and accessibility.

Key facts

Operator
Google
Family
Google
Purpose
Site Owner Fetch
User-Agent
Chrome-Lighthouse
Should you block it?
Depends
Verified
Yes
Respects robots.txt
No
Identity type
Unknown
Confidence
Medium
Last verified
2026-04-01
Last checked
2026-04-01

Bot details

Identity

User-Agent
Chrome-Lighthouse
robots.txt token
Chrome-Lighthouse
HTTP user-agent
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3464.0 Mobile Safari/537.36 Chrome-Lighthouse

Ownership

Operator
Google
Family
Google
Type
Scraper
Purpose
Site Owner Fetch

Verification and trust

Source type
Unknown
Confidence
Medium
Last verified
2026-04-01
Last checked
2026-04-01
Verification
Validate the identifying user-agent or signature against the operator documentation before creating hard allow rules.
Spoofing risk
User-agent strings can be spoofed. For allow-listing or low-friction rules, pair the published identifier with operator documentation or cryptographic verification when available.
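As a minimal sketch of that caveat, the check below (hypothetical helper names, not part of any operator tooling) treats a match on the published token as a claim of identity only, never as proof:

```python
import re

# First-pass filter for the published Chrome-Lighthouse UA token.
# A match only means the client *claims* to be Lighthouse; the
# string can be spoofed, so pair this with other signals before
# creating allow rules.
LIGHTHOUSE_TOKEN = re.compile(r"Chrome-Lighthouse", re.IGNORECASE)

def claims_lighthouse(user_agent: str) -> bool:
    """Return True if the UA string contains the Lighthouse token."""
    return bool(LIGHTHOUSE_TOKEN.search(user_agent))

ua = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5 Build/MRA58N) "
      "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3464.0 "
      "Mobile Safari/537.36 Chrome-Lighthouse")
print(claims_lighthouse(ua))             # True
print(claims_lighthouse("Mozilla/5.0"))  # False
```

A positive result here should gate further verification, not grant access by itself.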

Blocking and detection

Robots

User-agent: Chrome-Lighthouse
Disallow: /

Note that this bot is listed above as not respecting robots.txt, so this rule documents intent but may not prevent audits on its own.

Cloudflare

(http.user_agent contains "Chrome-Lighthouse")
Advanced server rules

Apache

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Chrome-Lighthouse [NC]
RewriteRule ^ - [F,L]

Nginx

if ($http_user_agent ~* "Chrome-Lighthouse") { return 403; }

Notes

Lighthouse is typically invoked on demand, for example from Chrome DevTools, the Lighthouse CLI, or PageSpeed Insights, rather than crawling sites autonomously; each run audits a single page and generates a report on performance and accessibility.

Known user-agent patterns: Chrome-Lighthouse
Known user-agent strings: Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3464.0 Mobile Safari/537.36 Chrome-Lighthouse
Robots.txt handling in the directory: no.
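Since Lighthouse identifies itself in the User-Agent header, its visits can be spotted in server logs. The sketch below assumes a combined-format access log where the final quoted field is the User-Agent; the helper name and field layout are illustrative, so adjust the pattern for your own log format:

```python
import re

# Extract the last quoted field of a combined-format log line,
# which by convention is the User-Agent string.
UA_FIELD = re.compile(r'"[^"]*" "([^"]*)"$')

def lighthouse_hits(lines):
    """Return log lines whose User-Agent contains the Lighthouse token."""
    hits = []
    for line in lines:
        m = UA_FIELD.search(line.strip())
        if m and "Chrome-Lighthouse" in m.group(1):
            hits.append(line)
    return hits

sample = [
    '1.2.3.4 - - [01/Apr/2026:00:00:00 +0000] "GET / HTTP/1.1" 200 123 '
    '"-" "Mozilla/5.0 ... Chrome-Lighthouse"',
    '5.6.7.8 - - [01/Apr/2026:00:00:01 +0000] "GET / HTTP/1.1" 200 456 '
    '"-" "Mozilla/5.0"',
]
print(len(lighthouse_hits(sample)))  # 1
```

Counting such hits is useful for confirming whether audits are reaching the site before deciding on blocking rules.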

Operator documentation: https://developers.google.com/web/tools/lighthouse

Evidence and source

  • Validate the identifying user-agent or signature against the operator documentation before creating hard allow rules.
  • Chrome-Lighthouse is an automated, open-source tool for auditing web page quality and does not operate as a traditional web crawler. It runs a series of audits against a given page to generate a report on performance and accessibility.
  • User-agent strings can be spoofed. For allow-listing or low-friction rules, pair the published identifier with operator documentation or cryptographic verification when available.