The ‘Techie’ Stuff → Crunching Code
This one’s for the robots—those creepy search engine crawlers analyzing and recording every line of code on your website before they report back to their masters…
Search Bot #6434: (robot voice) Hel-lo, I am pleased to report that Site X featured clean code, concise meta tags with relevant keywords, an up-to-date sitemap file, and most of all, a helpful robots.txt letting me know exactly which pages to crawl and which ones to skip. Be back soon!
Google: (godly voice) Thank you, Search Bot #6434. I will be sure to take that into consideration next time someone searches for “used book stores” near Madison, Wisconsin.
SEO Files & W3C Compliance
Definition: Optimized page files that comply with the latest web standards and adhere to best practices for search engine cataloging. No extraneous code.
Includes: Code on every page of a website, sitemap and robots.txt files.
FMC Plan: Scrub code to remove errors and browser incompatibilities. Create or update a sitemap file; depending on the scope of your website, you may need multiple sitemap files. Create or update a robots.txt file to exclude pages that don’t need to be crawled.
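As a rough sketch, a robots.txt file for the used-bookstore example above might look like this (all paths and the domain are hypothetical):

```
# robots.txt — lives at the root of the site, e.g. https://www.example.com/robots.txt

# Rules for all crawlers
User-agent: *
# Skip pages that don't belong in search results
Disallow: /admin/
Disallow: /cart/
Disallow: /search-results/

# Point crawlers to the sitemap file
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is how Search Bot #6434 finds your up-to-date sitemap without guessing.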
On-Page Optimization
Definition: Code updates that tell search engines what your website and individual website pages are about.
Examples: Title tags, meta tags, meta descriptions, image tags, image descriptions—present on every page.
FMC Plan: We tackle on-page optimization in batches. Depending on how many pages your website has, our initial update of title tags and metadata may take a few months. After we’re done, we monitor keyword rankings and adjust accordingly. Like most of our strategies, on-page optimization is an ongoing process.
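To make those elements concrete, here is a sketch of what they look like in a page’s markup, written for the used-bookstore example above (the store name, domain, and all values are hypothetical):

```html
<head>
  <meta charset="utf-8">
  <!-- Title tag: the clickable headline shown in search results -->
  <title>Used Book Store in Madison, WI | Example Books</title>
  <!-- Meta description: the short snippet shown beneath the title -->
  <meta name="description"
        content="Browse thousands of used books at our Madison, Wisconsin shop. Rare finds, fair prices, friendly staff.">
</head>

<!-- Image with descriptive alt text, so crawlers know what the picture shows -->
<img src="storefront.jpg"
     alt="Front entrance of a used book store in Madison, Wisconsin">
```

Every page gets its own title and description; duplicating them across pages is one of the most common on-page mistakes.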
Beyond the Code…
Code optimization benefits search engines as well as users: sites with clean code tend to load faster and render more consistently across browsers.