ExpertXSS is a Python-based web security tool for authorized testing of Cross‑Site Scripting (XSS) exposure.
It uses a two‑phase approach to rapidly locate reflections and then probe with context‑aware payloads. It can crawl a target, fuzz forms/params, test header-based reflections, and produce JSON/CSV/HTML reports.
⚠️ Legal: Use only on systems you own or have explicit permission to test.
ExpertXSS automatically fetches fresh payloads from PayloadBox’s XSS list and rotates realistic User‑Agents from a popular gist.
Compared to simple “reflect and alert(1)” scripts, ExpertXSS adds:
- Two-phase strategy: fast reflection discovery (marker-only) → targeted XSS payload tests
- Crawler: respects `robots.txt` by default; `--ignore-robots` to override
- Form fuzzing: submits forms, preserves hidden fields, injects values
- Header injection trials: `Referer`, `X-Forwarded-For`, `X-Api-Version`, etc.
- Heuristics: detects raw/HTML/URL/double-URL reflections & infers HTML/Attr/JS contexts
- Security headers audit: CSP, X-Content-Type-Options, Referrer-Policy, etc.
- Reports: pretty HTML, plus JSON and CSV for pipelines
- Networking: proxy/SOCKS, retries with backoff, rate limit, concurrency, custom headers/cookies
- Windows-friendly: color output via `colorama`; no POSIX-only calls
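The reflection heuristics above can be illustrated with a minimal sketch. The function name and exact checks here are assumptions for illustration, not ExpertXSS's actual internals:

```python
import html
import urllib.parse
from typing import Optional

def classify_reflection(needle: str, body: str) -> Optional[str]:
    """Infer how an injected value came back in the response body:
    unmodified (raw), HTML-entity encoded, URL-encoded, or double-URL-encoded."""
    if needle in body:
        return "raw"
    if html.escape(needle, quote=True) in body:
        return "html"
    once = urllib.parse.quote(needle, safe="")
    if once in body:
        return "url"
    if urllib.parse.quote(once, safe="") in body:
        return "double-url"
    return None
```

A raw reflection of a marker like `<m>` is the strongest signal; an HTML-entity or URL-encoded echo suggests output encoding is in play and narrows which payloads are worth trying in phase two.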
- Dynamic Payload Retrieval — Pulls the latest payloads with conditional GET (ETag/Last‑Modified) caching.
- User-Agent Rotation — Randomizes a large UA pool to vary requests.
- Concurrent Scanning — Multi-threaded workers with request rate limiting.
- Lightweight WAF Check — Looks for common WAF telltales; can be disabled with `--no-waf-check`.
- GET/POST/FORM Injection — Injects into query/body params and discovered forms.
- Header Injection — Optional checks for reflections via common headers.
- Save Results — Export with `--output results.json`, `--csv results.csv`, and `--html report.html`.
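The conditional-GET caching behind Dynamic Payload Retrieval follows the standard HTTP revalidation pattern. A minimal standard-library sketch (file names and function names are illustrative, not ExpertXSS's actual code):

```python
import json
import os
import urllib.error
import urllib.request

def conditional_headers(meta: dict) -> dict:
    """Build HTTP revalidation headers from a saved ETag/Last-Modified pair."""
    headers = {}
    if meta.get("etag"):
        headers["If-None-Match"] = meta["etag"]
    if meta.get("last_modified"):
        headers["If-Modified-Since"] = meta["last_modified"]
    return headers

def fetch_payloads(url: str, cache_meta: str = "payloads.meta.json",
                   cache_body: str = "payloads.txt") -> str:
    """Fetch the payload list, answering 304 Not Modified from the local cache."""
    meta = {}
    if os.path.exists(cache_meta) and os.path.exists(cache_body):
        with open(cache_meta) as f:
            meta = json.load(f)
    req = urllib.request.Request(url, headers=conditional_headers(meta))
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            body = resp.read().decode("utf-8", "replace")
            validators = {"etag": resp.headers.get("ETag"),
                          "last_modified": resp.headers.get("Last-Modified")}
    except urllib.error.HTTPError as err:
        if err.code == 304:  # upstream unchanged: reuse the cached copy
            with open(cache_body) as f:
                return f.read()
        raise
    with open(cache_meta, "w") as f:
        json.dump(validators, f)
    with open(cache_body, "w") as f:
        f.write(body)
    return body
```

When the upstream list has not changed, the server answers 304 with an empty body, so repeated scans pay only a header round-trip instead of re-downloading the full payload file.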
- Python 3.8+
- Packages: `requests`, `beautifulsoup4`, `colorama`, `tqdm`, `lxml`, `html5lib`
  Optional: `requests[socks]` for SOCKS proxies.

```bash
pip install requests beautifulsoup4 colorama tqdm lxml html5lib
# optional (SOCKS):
pip install 'requests[socks]'
```
- Clone this repository

  ```bash
  git clone https://github.com/Masriyan/ExpertXSS.git
  cd ExpertXSS
  ```

- (Optional) Create a virtual environment

  ```bash
  python -m venv venv
  # Linux/Mac
  source venv/bin/activate
  # Windows
  venv\Scripts\activate
  ```

- Install dependencies

  ```bash
  pip install -r requirements.txt  # if provided
  # or
  pip install requests beautifulsoup4 colorama tqdm lxml html5lib
  ```

- Run the tool

  ```bash
  python ExpertXSS.py -h
  ```
```bash
# Single target (GET)
python ExpertXSS.py -u "https://example.com/search?q=test" --concurrency 8 --max-payloads 200 --html report.html

# Crawl and fuzz
python ExpertXSS.py --crawl -u https://example.com --max-pages 200 --max-depth 2 --concurrency 10 --html report.html

# Authenticated POST with headers/cookies and proxy
python ExpertXSS.py -u https://example.com/login -m POST \
  --data "user=demo&pass=demo" \
  --header "X-Client: audit" --cookie "sid=abc123" \
  --proxy http://127.0.0.1:8080 --test-headers \
  --max-payloads 150 --html report.html
```

| Flag/Option | Description | Default |
|---|---|---|
| `-u, --url` | Target URL (seed for crawl or single target). | — |
| `-m, --method` | HTTP method for direct mode: `GET` or `POST`. | `GET` |
| `-p, --param` | Param name for single-target mode if URL has none. | `q` |
| `--data` | POST body template for single-target POST mode, e.g., `a=b&c=d`. | — |
| `--crawl` | Enable crawler to discover URLs & forms. | off |
| `--max-pages` | Maximum pages to crawl. | 150 |
| `--max-depth` | Maximum crawl depth. | 2 |
| `--allow-external` | Follow external domains (default: same host only). | off |
| `--ignore-robots` | Ignore `robots.txt` (default: respect). | off |
| `--timeout` | HTTP timeout (seconds). | 12 |
| `--rate` | Requests-per-second cap (0 = unlimited). | 0 |
| `--retries` | Retry attempts on network errors. | 2 |
| `--backoff` | Backoff factor between retries. | 0.7 |
| `--proxy` | Proxy URL (e.g., `http://127.0.0.1:8080` or `socks5h://127.0.0.1:9050`). | — |
| `--proxy-list` | File with a list of proxies (one per line). | — |
| `--header` | Custom header, e.g., `--header "X-Client: audit"` (repeatable). | — |
| `--cookie` | Cookie string, e.g., `sid=abc123; theme=dark`. | — |
| `--encode` | Encode injected values: `url`, `base64`, `double-url`, `html`. | — |
| `--delay` | Delay between requests (seconds). | 0.0 |
| `--concurrency` | Concurrent workers. | 6 |
| `--max-payloads` | Limit number of payloads per param (speeds up scans). | all |
| `--test-headers` | Try header-based reflection (e.g., `Referer`, `X-Forwarded-For`). | off |
| `--no-waf-check` | Skip WAF detection. | off |
| `--payload-file` | Custom payload file path. | — |
| `--log` | Log file path. | — |
| `--output` | Save findings to JSON file. | — |
| `--csv` | Save findings to CSV file. | — |
| `--html` | Save a pretty HTML report. | — |
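The `--encode` values correspond to standard transformations. A rough sketch of what each mode does to an injected value (illustrative; the function name is hypothetical, not ExpertXSS's exact implementation):

```python
import base64
import html
import urllib.parse

def encode_payload(payload: str, mode: str) -> str:
    """Apply one of the --encode transformations to an injected value."""
    if mode == "url":
        return urllib.parse.quote(payload, safe="")
    if mode == "double-url":
        return urllib.parse.quote(urllib.parse.quote(payload, safe=""), safe="")
    if mode == "base64":
        return base64.b64encode(payload.encode()).decode()
    if mode == "html":
        return html.escape(payload)
    raise ValueError(f"unknown encode mode: {mode}")

# Example:
# encode_payload('<svg onload=alert(1)>', 'url')
#   -> '%3Csvg%20onload%3Dalert%281%29%3E'
```

Double-URL encoding is useful when a target decodes the value twice (for example, a proxy decodes once and the application decodes again), which can slip a payload past filters that inspect only the once-decoded form.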
- Basic usage

  ```bash
  python ExpertXSS.py -u "https://example.com"
  ```

- Concurrent scanning

  ```bash
  python ExpertXSS.py -u "https://example.com" --concurrency 5
  ```

- Use a proxy & POST method

  ```bash
  python ExpertXSS.py -u "https://example.com/vuln" -m POST -p "search" --proxy "http://127.0.0.1:8080"
  ```

- Save to JSON/CSV/HTML

  ```bash
  python ExpertXSS.py -u "https://example.com" --output results.json --csv results.csv --html report.html
  ```

- Skip WAF check

  ```bash
  python ExpertXSS.py -u "https://example.com" --no-waf-check
  ```

- JSON (`--output results.json`) — Full finding objects, suitable for pipelines.
- CSV (`--csv results.csv`) — Flat table for spreadsheets and quick diffing.
- HTML (`--html report.html`) — Self-contained, pretty report including evidence and security headers.
Each finding includes: URL, method, location (query/body/form/header), parameter, reflection style (raw/html/url/double-url), inferred context (HTML/Attr/JS), status code, response security headers, and an evidence snippet.
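For pipeline use, the JSON export can be post-processed in a few lines of Python. The field names below mirror the finding fields listed above, but the exact schema is an assumption; adjust the keys to match the actual `results.json`:

```python
import json

# Hypothetical sample records mirroring the documented finding fields.
sample = [
    {"url": "https://example.com/search", "method": "GET", "location": "query",
     "parameter": "q", "reflection": "raw", "context": "HTML", "status": 200},
    {"url": "https://example.com/login", "method": "POST", "location": "body",
     "parameter": "user", "reflection": "html", "context": "Attr", "status": 200},
]

def high_signal(findings):
    """Keep raw (unencoded) reflections -- the ones most likely exploitable."""
    return [f for f in findings if f.get("reflection") == "raw"]

# Typical usage against a real export:
# with open("results.json") as fh:
#     findings = json.load(fh)
# for f in high_signal(findings):
#     print(f["method"], f["url"], f["parameter"])
```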
Below is a demonstration of ExpertXSS in action:
Released under the MIT License. See LICENSE for details.
Maintainer: Sudo3rs
