Search engines use small automated programs, called bots or spiders, to scan the content of web pages and break it down into relevant words and phrases. They then add those words and phrases to an index; this process is called indexing. Bots routinely index millions of web pages and build massive indexes.
When a person uses a search engine to find a word or phrase, the engine looks the term up in its index and generates a list of web pages on which that word or phrase appears. The entries in this list are called search results. If the word or phrase someone searches for is present on one of your web pages, that page could appear in their search results.
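The index described above is what is commonly called an inverted index: a mapping from each word to the pages that contain it. The following is a toy sketch of the idea (the page URLs and texts are invented for illustration; real search engines are vastly more complex):

```python
# Hypothetical crawled pages: URL -> page text (illustrative data only).
pages = {
    "example.com/cats": "cats are small furry animals",
    "example.com/dogs": "dogs are loyal animals",
    "example.com/pets": "cats and dogs make great pets",
}

# Build the inverted index: word -> set of URLs containing that word.
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

def search(word):
    """Look the word up in the index and return the matching pages."""
    return sorted(index.get(word, set()))

print(search("cats"))     # -> ['example.com/cats', 'example.com/pets']
print(search("animals"))  # -> ['example.com/cats', 'example.com/dogs']
```

Because the lookup is a single dictionary access rather than a scan of every page, answering a query stays fast even when the index covers millions of pages.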
However, the same words and phrases on your web pages are likely to appear on hundreds of thousands of pages on other websites as well. A search engine has to make an educated guess about which of those pages best match the context of a user's query and list them in the search results in order of relevance.
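One very crude way to picture that relevance ordering is term-frequency scoring: pages that mention the query words more often rank higher. This is only a toy model with made-up page data; real engines weigh hundreds of additional signals.

```python
# Hypothetical pages competing for the same query (illustrative data only).
pages = {
    "site-a.com": "fresh coffee beans and coffee grinders",
    "site-b.com": "coffee shop reviews",
    "site-c.com": "tea and herbal infusions",
}

def rank(query):
    """Score each page by how many times it contains the query terms,
    then return matching pages ordered from highest score to lowest."""
    terms = query.lower().split()
    scores = {}
    for url, text in pages.items():
        words = text.lower().split()
        score = sum(words.count(t) for t in terms)
        if score > 0:
            scores[url] = score
    # Highest score first: the engine's "educated guess" at relevance.
    return sorted(scores, key=scores.get, reverse=True)

print(rank("coffee"))  # -> ['site-a.com', 'site-b.com']
```

Note that site-c.com never mentions "coffee" and so is excluded entirely, while site-a.com outranks site-b.com because it mentions the term twice.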