Chapter 7. Technical Rewrites for Search Engines
In previous chapters, we have looked at some of the standard adjustments we can make to our .htaccess and robots.txt files in order to either speed up our website or completely block search engine access to our development site.
In this section, we'll be looking at a few crucial adjustments we can make to these files to strengthen our protection against duplicate-content penalties and to reduce the number of unnecessary pages cached by search engines.
The .htaccess and robots.txt files that we are editing in this section live in the root directory of our Magento website.
In this chapter, we will be learning how to:

- Use advanced techniques in the .htaccess file to maintain URL consistency and redirect old URLs to their newer equivalents
- Improve our robots.txt file to disallow areas that should not be visible to search engines
- Use an observer to prevent duplicate-content issues on filtered category pages