
Ben Dunlea

Learning JavaScript: I Built a robots.txt Generator

So I'm learning JavaScript, and I decided to create a simple but useful tool: a robots.txt generator. This tool allows users to customize their robots.txt file, which tells search engine crawlers which pages to crawl and which to ignore. Here's a breakdown of how I built it:

Setting Up the HTML Structure
The first step was to create the basic HTML structure. I included a form with various input elements for users to specify their preferences:

A dropdown menu to set the default behavior for all robots (either Allow or Disallow).
A dropdown menu to specify a crawl-delay value.
An input field for the sitemap URL.
A section to add specific user agents and their corresponding paths.
A section to add restricted directories.
A button to generate the robots.txt file.
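As a rough sketch, the markup looked something like this. The `userAgents` id is the one the script below actually queries; the other ids and the exact options are illustrative placeholders, not the real markup:

```html
<!-- Sketch of the form; only the "userAgents" id is taken from the
     script below — the rest are placeholder names. -->
<form id="robotsForm">
  <label>Default for all robots:
    <select id="defaultBehavior">
      <option value="Allow">Allow</option>
      <option value="Disallow">Disallow</option>
    </select>
  </label>
  <label>Crawl-delay:
    <select id="crawlDelay">
      <option value="">None</option>
      <option value="5">5</option>
      <option value="10">10</option>
    </select>
  </label>
  <input type="url" id="sitemapUrl" placeholder="https://example.com/sitemap.xml">
  <div id="userAgents"></div>      <!-- user-agent rows are added here -->
  <div id="restrictedDirs"></div>  <!-- restricted-directory rows go here -->
  <button type="button" id="generateBtn">Generate robots.txt</button>
  <textarea id="output" readonly></textarea>
</form>
```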
Adding JavaScript Functionality
To make the form interactive, I wrote JavaScript functions to handle the following tasks:

Adding User Agents: A function to dynamically add user-agent input sets, each with a dropdown to select a search bot and an input field for the path to disallow.
Adding Restricted Directories: A function to add input fields for specifying directories to be disallowed.
Generating the Robots.txt File: A function to compile the user inputs into a properly formatted robots.txt file and display it in a textarea.
Downloading the File: A function to download the generated robots.txt file.
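The generation step boils down to joining the collected values into the robots.txt line format. Here's a sketch of that logic as a pure function; the name `buildRobotsTxt` and the input object shape are mine — the real version reads these values out of the DOM before formatting:

```javascript
// Sketch of the formatting logic (my naming and input shape); the
// actual generator collects these values from the form fields first.
function buildRobotsTxt({ defaultBehavior, crawlDelay, sitemapUrl, userAgents, restrictedDirs }) {
    const lines = ['User-agent: *'];

    // Default rule for all robots: an empty Disallow line allows
    // everything, "Disallow: /" blocks the whole site.
    lines.push(defaultBehavior === 'Disallow' ? 'Disallow: /' : 'Disallow:');

    // Restricted directories apply to the wildcard user agent.
    for (const dir of restrictedDirs) {
        lines.push(`Disallow: ${dir}`);
    }

    if (crawlDelay) {
        lines.push(`Crawl-delay: ${crawlDelay}`);
    }

    // One block per specific bot, separated by a blank line.
    for (const { name, path } of userAgents) {
        lines.push('', `User-agent: ${name}`, `Disallow: ${path}`);
    }

    if (sitemapUrl) {
        lines.push('', `Sitemap: ${sitemapUrl}`);
    }
    return lines.join('\n');
}
```

From there, downloading is just wrapping the returned string in a Blob and clicking a temporary anchor with a `download` attribute.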
Example Functions
Here's a snippet of the JavaScript code that adds user-agent input sets:

function addUserAgent() {
    const userAgentsDiv = document.getElementById('userAgents');
    const uaDiv = document.createElement('div');
    uaDiv.className = 'user-agent';

    // Dropdown listing the search bots the user can target.
    const uaSelect = document.createElement('select');
    uaSelect.className = 'userAgentName';
    const searchBots = ['Google', 'Bing', 'Yahoo', 'Baidu'];
    searchBots.forEach(bot => {
        const option = document.createElement('option');
        option.value = bot;
        option.textContent = bot;
        uaSelect.appendChild(option);
    });

    // Text field for the path this bot should be disallowed from.
    const pathInput = document.createElement('input');
    pathInput.type = 'text';
    pathInput.placeholder = '/path-to-disallow/';
    pathInput.className = 'path';

    // Button to remove this user-agent row again.
    const removeButton = document.createElement('button');
    removeButton.textContent = 'Remove';
    removeButton.type = 'button';
    removeButton.onclick = function() { userAgentsDiv.removeChild(uaDiv); };

    uaDiv.appendChild(uaSelect);
    uaDiv.appendChild(pathInput);
    uaDiv.appendChild(removeButton);
    userAgentsDiv.appendChild(uaDiv);
}


Conclusion
Building this robots.txt generator was a great exercise in JavaScript. It helped me understand how to dynamically create and manage form elements, handle user input, and generate files for download.

If you want to follow along with more of what I'm doing, you can here: Hatch
