First, you will need to create or update your site's robots.txt file so that it allows or disallows content depending on your needs.
  1. Download the existing robots.txt file from your web server via FTP and save it locally on your computer (if the file does not exist yet, create an empty robots.txt file locally instead).
  2. Edit the robots.txt file to allow or disallow content depending on your needs (see the example after this list).
  3. Save the changed robots.txt file and upload it back to the root directory of your site via FTP.
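For example, a robots.txt file along the following lines blocks all crawlers from one directory while leaving the rest of the site open. The /private/ path and the sitemap URL are placeholders only; substitute the paths you actually want to control. A User-agent value of * applies the rules to every crawler.

    User-agent: *
    Disallow: /private/
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml

Each Disallow line blocks the path it names; an empty Disallow value (or no Disallow line at all) permits crawling of everything for that user agent.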
Note: changes made to robots.txt will not take effect until Google next crawls your site, which can take some time.

For more information on how to access your FTP server, see BB735774.