PortSwigger Academy - Unprotected Admin Functionality
This lab has an unprotected admin panel. Solve the lab by deleting the user carlos.

Discovery Process
This was a quick and satisfying lab to complete. I began with a directory scan, looking for commonly exposed or misconfigured files with a custom tool called common_findings.py; other scanners such as DirBuster or Gobuster work just as well.
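For readers who want to reproduce the idea, here is a minimal Python sketch of the kind of check such a scanner performs. It is not common_findings.py: the wordlist is illustrative and the base URL is the lab's placeholder.

```python
import requests

# Placeholder; substitute your own lab URL.
BASE_URL = "https://<target-lab>.web-security-academy.net"

# Tiny illustrative wordlist of commonly exposed files and paths.
COMMON_PATHS = ["/robots.txt", "/sitemap.xml", "/.git/HEAD", "/admin", "/administrator-panel"]

for path in COMMON_PATHS:
    resp = requests.get(BASE_URL + path, allow_redirects=False, timeout=10)
    marker = "[+]" if resp.status_code == 200 else "[-]"
    print(f"{marker} {path} -> {resp.status_code}")
```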
During the scan, I discovered a publicly accessible robots.txt file at:
https://<target-lab>.web-security-academy.net/robots.txt
Upon visiting this file, I found the following lines:
User-agent: *
Disallow: /administrator-panel
What is robots.txt?
The robots.txt file provides instructions to web crawlers (programs that automatically browse websites to collect and index content) about which directories and files should not be accessed or indexed.
- User-agent: * means these rules apply to all web crawlers.
- Disallow: /administrator-panel instructs bots not to crawl or index the /administrator-panel path.
It’s important to note that these rules do not apply to human users; they’re only guidelines for bots. This makes robots.txt a common place to uncover hidden or sensitive areas of a website.
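As an illustration, a short Python sketch that fetches robots.txt and pulls out its Disallow entries (the base URL is again the lab's placeholder):

```python
import requests

BASE_URL = "https://<target-lab>.web-security-academy.net"

# Fetch robots.txt and list every path the site asks crawlers to avoid.
resp = requests.get(BASE_URL + "/robots.txt", timeout=10)
for line in resp.text.splitlines():
    line = line.strip()
    if line.lower().startswith("disallow:"):
        path = line.split(":", 1)[1].strip()
        print(f"Disallowed (and therefore interesting): {path}")
```

Each disallowed path is a candidate for manual inspection, which is exactly what happens next.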
Exploiting the Hidden Admin Panel
Knowing that the /administrator-panel path exists, I manually navigated to it by appending it to the lab’s base URL:
https://<target-lab>.web-security-academy.net/administrator-panel
I was granted access without any authentication.
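You can confirm the missing access control with a single unauthenticated request. Below is a minimal sketch (placeholder base URL); a properly protected panel would normally respond with 401/403 or redirect to a login page instead.

```python
import requests

BASE_URL = "https://<target-lab>.web-security-academy.net"

# Request the admin panel with no cookies or credentials at all.
resp = requests.get(BASE_URL + "/administrator-panel", allow_redirects=False, timeout=10)
print(resp.status_code)        # 200 here means the panel is served unauthenticated
print("carlos" in resp.text)   # True if the user list is already visible
```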
Inside the panel, I found a list of users:
Users:
- wiener [Delete]
- carlos [Delete]
To complete the lab, I simply clicked the Delete button next to the carlos user.
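The same action can be reproduced directly over HTTP. The delete endpoint below is an assumption based on how such Delete links are typically built; if the real href differs, copy it from the page instead.

```python
import requests

BASE_URL = "https://<target-lab>.web-security-academy.net"

# Assumed endpoint behind the "Delete" link; verify against the actual href in the panel.
resp = requests.get(BASE_URL + "/administrator-panel/delete",
                    params={"username": "carlos"}, timeout=10)
print(resp.status_code)
```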
Lab Complete!
By accessing a directory hidden via robots.txt, I was able to discover an unprotected admin panel and delete the specified user. This highlights the importance of:
- Not relying on robots.txt for security.
- Properly protecting sensitive areas of a website with authentication mechanisms.
Mitigations
To prevent similar security flaws in real-world applications, consider the following best practices:
- Do Not Rely on robots.txt for Security - The robots.txt file is intended for search engine crawlers, not for enforcing access control. Sensitive directories or administrative panels should never be hidden solely using this mechanism.
- Implement Authentication and Authorization Controls - All sensitive areas, such as admin panels or internal dashboards, should be protected using strong authentication (e.g., session-based login or multi-factor authentication). Role-based access controls should ensure that only authorized users can perform administrative actions. A minimal sketch of this idea appears after this list.
- Conduct Access Control Testing - Regularly test access control mechanisms to ensure unauthorized users cannot access sensitive endpoints. Use tools and manual checks to attempt access without authentication or with low-privilege accounts.
- Avoid Security Through Obscurity - Security should not rely on hiding paths or resources. All sensitive endpoints should assume they are discoverable and be secured accordingly.
- Monitor and Log Access to Sensitive Endpoints - Maintain logs for access to administrative paths and monitor for any unauthorized access attempts. Early detection of suspicious behavior can help mitigate damage.
- Security Awareness and Code Review - Developers should be trained to understand the implications of exposing administrative functions without proper protection. Peer reviews and secure coding practices should be part of the development lifecycle.
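As a rough illustration of the authentication and authorization point above, here is a minimal Flask sketch. The framework, route name, and session fields are illustrative choices, not taken from the lab; the point is that the sensitive route enforces both a login check and a role check before doing anything.

```python
from functools import wraps
from flask import Flask, session, abort

app = Flask(__name__)
app.secret_key = "change-me"  # placeholder; load from configuration in real code

def admin_required(view):
    """Reject requests unless the session belongs to an authenticated admin."""
    @wraps(view)
    def wrapper(*args, **kwargs):
        if not session.get("user_id"):
            abort(401)                   # not logged in at all
        if session.get("role") != "admin":
            abort(403)                   # logged in, but not authorized
        return view(*args, **kwargs)
    return wrapper

@app.route("/administrator-panel")
@admin_required
def admin_panel():
    # Administrative actions (such as deleting users) belong behind this check.
    return "Admin panel"
```

With a guard like this in place, hiding the path in robots.txt becomes irrelevant: even a visitor who discovers the URL cannot use it without valid admin credentials.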