Hi everyone,
I am Aditya Shende (Kong) from India, a bounty hunter, biker, and researcher.
This is my 13th article. If you find any spelling errors, let them be… Let's start.
JavaScript (.js) files serve as repositories for client-side code and can function as the fundamental framework of websites, particularly in contemporary contexts. With the evolution of technology, I have observed an increasing prevalence of substantial data stored within .js files on websites. When inspecting the source code of a website, it is common to encounter references to main.js and app.js, which are typical bundle names for frameworks such as ReactJS. These websites rely heavily on JavaScript and employ Ajax requests. These files encompass comprehensive information about the application, while also utilizing specific and distinct JavaScript files for each endpoint, as depicted below.
.js files are frequently underestimated due to their inclusion of intricate and unfamiliar code, which may appear nonsensical. However, by employing targeted keyword searches, valuable information can be extracted. Over time, as you become familiar with JavaScript and comprehend how it works, a clearer understanding of these files and their functionality will emerge. This is where an understanding of code and proficiency in JavaScript proves invaluable in the pursuit of bug bounties.
Locating .js files:
The process of finding .js files is relatively straightforward. One approach is to right-click on the web page and select “view source” (or visit view-source:https://www.website.com/). Then, you can search for occurrences of “.js” within the HTML code. This method is suitable for manual hackers, as it allows you to identify .js files that exclusively contain code relevant to the specific endpoint you are exploring. For example, you may come across a file named “config.js” that is specifically associated with that endpoint; it might unveil new API endpoints that were previously unknown to you.
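The same idea can be sketched from the command line. The HTML sample below is a stand-in for a real response; in practice you would pipe curl -s https://www.website.com/ into the grep instead:

```shell
# Extract script references from HTML. The inline sample stands in for a
# fetched page; replace the heredoc with: curl -s https://www.website.com/
cat <<'EOF' | grep -oE 'src="[^"]*\.js[^"]*"'
<html><head>
<script src="/static/js/main.a1b2c3.js"></script>
<script src="https://cdn.example.com/app.js?v=3"></script>
</head></html>
EOF
```

This surfaces both local bundles and third-party scripts, including any query strings appended to the file names.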
When employing Burp Suite’s spidering functionality, you will encounter numerous .js files, which should be subjected to further investigation. Additionally, as mentioned earlier, if the target website utilizes ReactJS, you are likely to encounter main.js and app.js files.
The items you should be searching for include:
- New Endpoints: Look for any references to new API endpoints within the JavaScript code. These endpoints might provide additional functionalities or access to specific features that are not available through the web application’s user interface.
- New Parameters: Pay attention to any new parameters being utilized in the JavaScript code. These parameters may allow you to manipulate or customize the behavior of the application.
- Hidden Features: Sometimes, the JavaScript code may contain sections or functions that are not exposed in the web application’s interface. These hidden features could potentially provide additional functionality or access to premium-only features. Determine if you can interact with these features even without a premium account.
- API Keys: Look for any occurrences of API keys within the JavaScript code. These keys may grant access to restricted APIs or sensitive data. Make sure to handle them securely and avoid exposing them.
- Developer Comments: Explore the JavaScript code for any developer comments, such as single-line (//) or multi-line (/* */) comments. These comments may reveal valuable information about the code, such as the date of publication or updates.
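A quick grep sketch covering two of the items above at once, endpoints and developer comments, assuming you have saved a bundle locally (bundle.js and its contents here are hypothetical):

```shell
# Hypothetical local copy of a bundle; in practice, download it with curl.
cat > bundle.js <<'EOF'
// TODO: remove debug endpoint before release
fetch("/api/v2/users");
fetch("/api/internal/debug");
EOF

# Quoted paths that look like API endpoints:
grep -oE '"/api/[^"]*"' bundle.js

# Single-line developer comments:
grep -oE '//.*' bundle.js
```

Patterns like "/api/ are a starting point; adjust them to whatever path conventions the target application uses.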
grep -r -E "aws_access_key|aws_secret_key|api key|passwd|pwd|heroku|slack|firebase|swagger|aws key|password|ftp password|jdbc|db|sql|secret jet|config|admin|json|gcp|htaccess|.env|ssh key|.git|access key|secret token|oauth_token|oauth_token_secret" /path/to/directory/*.js
Make sure to replace /path/to/directory with the actual path to the directory where your .js files are located. The command will search for the specified keywords in all .js files within that directory.
Please note that it's important to exercise caution and follow ethical guidelines when performing searches like this, ensuring you have proper authorization to access and analyze the files.
The following command sequence discovers subdomains, identifies live JavaScript files (HTTP response status 200), and saves the results in separate files. Here’s an explanation of each part:
- subfinder -d domain.com : Subfinder is a subdomain discovery tool that uses various sources to find subdomains associated with the specified domain (domain.com).
- | httpx -mc 200 : The pipe (|) symbol passes the output of the previous command as input to the next. httpx sends HTTP requests to each subdomain and keeps only those that respond with a status code of 200 (successful response).
- | tee subdomains.txt : tee is a command-line utility that displays the output on the screen while simultaneously saving it to a file, here subdomains.txt.
- cat subdomains.txt | waybackurls : cat reads the contents of subdomains.txt and passes them to waybackurls, a tool that retrieves historical URLs from the Wayback Machine, an archive of web pages. This helps in finding URLs that were previously available but may not be currently accessible.
- | httpx -mc 200 : As before, httpx filters the historical URLs down to those that still return a status code of 200.
- | grep "\.js" | tee js.txt : grep keeps only the URLs containing the “.js” extension (JavaScript files; the dot is escaped so it matches literally), and tee saves them to js.txt while displaying them on the screen.
In summary, this command sequence combines various tools (subfinder, httpx, waybackurls, and grep) to find subdomains, retrieve historical URLs, filter JavaScript files, and save the results in separate files (subdomains.txt and js.txt).
The complete command is as follows:
subfinder -d domain.com | httpx -mc 200 | tee subdomains.txt && cat subdomains.txt | waybackurls | httpx -mc 200 | grep "\.js" | tee js.txt
Please ensure that you replace domain.com with the actual domain you want to search.
Now you can grep the collected URLs (note that this matches keywords appearing in the URLs themselves; to search the actual file contents, download them first):
cat js.txt | grep -E "aws_access_key|aws_secret_key|api key|passwd|pwd|heroku|slack|firebase|swagger|aws key|password|ftp password|jdbc|db|sql|secret jet|config|admin|json|gcp|htaccess|.env|ssh key|.git|access key|secret token|oauth_token|oauth_token_secret"
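Since js.txt holds URLs rather than file contents, here is a hedged sketch for grepping the JavaScript itself (secret_grep is a helper name introduced here, and its keyword list is abbreviated):

```shell
# secret_grep reads JavaScript source on stdin and flags common secret
# keywords (abbreviated list; extend it with the full pattern above).
secret_grep() {
  grep -E "aws_access_key|aws_secret_key|api_key|heroku|slack|firebase|oauth_token"
}

# In practice, fetch each URL from js.txt and pipe it through the helper:
#   while read -r url; do curl -s "$url" | secret_grep; done < js.txt

# Demo on an inline sample standing in for a downloaded bundle:
echo 'var cfg = { aws_access_key: "AKIA..." };' | secret_grep
```

Fetching each file also lets you save local copies for the directory-based grep shown earlier.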
Once you have the JS URLs, you can run Nuclei’s exposures templates against them to uncover more sensitive information.
To run Nuclei on the js.txt file with the exposures templates, you can use the following command:
nuclei -l js.txt -t ~/nuclei-templates/exposures/ -o js_exposures_results.txt
Here’s an explanation of each part of the command:
- nuclei : Runs Nuclei, a fast and customizable vulnerability scanner.
- -l js.txt : The -l flag specifies the file (js.txt) containing the list of URLs to scan.
- -t ~/nuclei-templates/exposures/ : The -t flag specifies the path to the Nuclei templates directory for the exposures category. Adjust ~/nuclei-templates/exposures/ to match the actual path where your Nuclei templates are stored.
- -o js_exposures_results.txt : The -o flag specifies the output file where the scan results will be saved. You can replace js_exposures_results.txt with the desired output file name.
Make sure you have Nuclei and the relevant templates (in this case, templates related to exposures) installed and configured properly before running the command. Adjust the paths and filenames according to your specific setup.
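Once the scan completes, a small sketch for triaging the results file. The sample lines mimic Nuclei’s default “[template-id] [protocol] [severity] url” output format; the findings themselves are invented:

```shell
# Sample results standing in for a real scan; the lines mimic Nuclei's
# default "[template-id] [protocol] [severity] url" output format.
cat > js_exposures_results.txt <<'EOF'
[aws-access-key] [http] [high] https://site.example/static/app.js
[firebase-config] [http] [medium] https://site.example/static/main.js
[google-api-key] [http] [high] https://site.example/static/vendor.js
EOF

# Count findings per severity to prioritize review:
grep -oE '\[(info|low|medium|high|critical)\]' js_exposures_results.txt | sort | uniq -c
```

Sorting by severity first helps when a large JS list produces hundreds of matches.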