The robots.txt is a plain text file stored in the root directory of a website's server. It is used to leave instructions for crawlers, for example which pages or directories they should not visit. In this way you can discourage search engines from crawling certain pages, which usually keeps them out of the search results.
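A minimal robots.txt might look like the following sketch. The paths and sitemap URL are placeholders, not requirements of the format; `User-agent` selects which crawlers a rule group applies to (`*` means all), and each `Disallow` line names a path prefix that those crawlers are asked not to visit.

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /private/

# Additional rules for a specific crawler
User-agent: Googlebot
Disallow: /drafts/

# Optional: point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line (no path) permits everything, and the file must be reachable at the site root, e.g. `https://www.example.com/robots.txt`, to be found by crawlers.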

It is important to realize that the instructions in the robots.txt are suggestions rather than binding rules. While most reputable crawlers will follow them, some may ignore them entirely. Furthermore, a page that is blocked in robots.txt can still end up in the index: a backlink from another website can be enough for a search engine to list the page, even though it never crawled it.
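Because of this, robots.txt alone cannot reliably keep a page out of search results. The standard mechanism for that is a `noindex` robots meta tag in the page's HTML head, as sketched below; note that the page must remain crawlable (i.e. not blocked in robots.txt), otherwise the crawler never sees the directive.

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Asks search engines not to include this page in their index -->
  <meta name="robots" content="noindex">
  <title>Internal page</title>
</head>
<body>
  <p>This page can be crawled, but should not appear in search results.</p>
</body>
</html>
```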