11 Technical Factors for Successful SEO
I have found that there are two kinds of search engine optimization professionals: those with a marketing background and those with a penchant for coding.
SEO marketers are typically decent writers and oftentimes very talented content strategists. They are capable of deftly incorporating keywords into site content that is both appealing to visitors and attractive to search engines. What golden grills and platinum chains are to the hip-hop industry, high-PageRank links and golden triangle rankings are to the SEO practice. The SEO marketing types typically know the best method for attaining both: the creation and distribution of compelling, optimized site content.

Technical SEOs, on the other hand, break down barriers that would otherwise thwart a great SEO campaign. They can spot impending danger where most of us just see endless slashes, braces, semicolons and other alphanumeric combinations interlaced throughout. SEO technicians eliminate indexing obstacles and open the lines of communication between websites and Google. Some of these people can write, but it’s been my experience that most prefer to leave all that to the writers.
One such member of the technical SEO camp is Oliver Tani. I’ve worked with Oliver on a number of different SEO projects and have been awe-struck by his talent. He helped me fully understand the potential issues websites can cause Google and the other engines when all they want to do is add your site to their index. Below is a list of 11 technical factors that can affect the way Google picks up your website.
(WARNING: I may get a bit nerdy with the verbiage in the checklist below. If this information doesn’t make sense, contact me. I’m happy to explain further.)
1. Canonicalization
When you enter your website address into a browser as https://www.yoursite.com, does it render? When you enter it as https://yoursite.com (without the www), does it also render? If both the www and non-www versions come up without one redirecting to the other, you have a canonicalization problem. Search engines view your site as two different sites, which can cause a duplicate content issue. This often comes up when you have multiple domains for a single site. Use a 301 redirect to solve this problem and stop confusing Google.
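If you want to test this yourself, here is a minimal Python sketch (assuming the third-party requests library and the placeholder domain used above) that fetches both versions without following redirects and reports what comes back.

```python
# A minimal canonicalization check: request the www and non-www versions
# of a placeholder domain and see whether one 301s to the other.
import requests  # third-party: pip install requests

def check_canonical(domain="yoursite.com"):  # placeholder domain
    for url in (f"https://{domain}/", f"https://www.{domain}/"):
        # Don't follow redirects so we can see the raw status code.
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(url, resp.status_code, resp.headers.get("Location", "-"))

check_canonical()
# Two 200s suggest a canonicalization problem; a 301 with a Location
# header on one of them means the redirect is already in place.
```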
2. Server Status
This one is normally a “gimme,” as you will usually know when there is a problem with your server. To check the status of your server, you can use a header check tool. The result you are looking for here is “200 OK”. Common error statuses are 400 – Bad Request, 404 – Not Found, and 500 – Internal Server Error. Unresponsive servers, or those returning errors, make it difficult for search engines to find and index your content.
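For the curious, here is a rough stand-in for a header check tool, again assuming the requests library and a placeholder URL.

```python
# A rough header check: issue a HEAD request and print the status code
# and reason phrase, the same result a header check tool would show.
import requests  # third-party: pip install requests

resp = requests.head("https://www.yoursite.com/", allow_redirects=True, timeout=10)
print(resp.status_code, resp.reason)  # looking for "200 OK"; watch for 400, 404, 500
```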
3. Sitemap
When websites were first built, sitemaps were used as a navigation tool, similar to an office lobby directory or the Yellow Pages. Since then, visitors have grown accustomed to present-day, conventional navigation structures. However, sitemaps still serve a purpose for SEO. They provide search engines with a handy list of all the pages on your website. Sitemaps should use text links with descriptive terms (i.e. keywords) for each page. Be careful not to include too many links on one page (more on that in #9), and break the sitemap into pieces if necessary. Sites should also include an XML sitemap, submitted through Google Webmaster Tools, to allow easy indexing.
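As a rough illustration of what the XML version contains, here is a bare-bones Python sketch that writes a minimal sitemap. The page list and priorities are hypothetical; real generators crawl the site to build the list.

```python
# Write a minimal sitemap.xml using the standard sitemaps.org namespace.
from xml.sax.saxutils import escape

pages = [
    ("https://www.yoursite.com/", "0.8"),              # home page, higher priority
    ("https://www.yoursite.com/services.htm", "0.5"),  # hypothetical inner pages
    ("https://www.yoursite.com/contact.htm", "0.5"),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url, priority in pages:
    lines.append(f"  <url><loc>{escape(url)}</loc><priority>{priority}</priority></url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```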

4. Robots.txt
Think of a robots.txt file as either a welcome mat or a “Beware of Dog” sign for your website. A robots.txt file should be present to let search engines know that they are more than welcome to index your site. You can also use this file to tell search engines to stay away from certain pages or directories that you do not want coming up in search results.
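If you want to confirm your rules behave the way you expect, Python’s standard library includes a robots.txt parser. This quick sketch (placeholder URLs) checks what Googlebot would be allowed to fetch.

```python
# Sanity-check robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.yoursite.com/robots.txt")  # placeholder site
rp.read()
print(rp.can_fetch("Googlebot", "https://www.yoursite.com/"))          # expect True
print(rp.can_fetch("Googlebot", "https://www.yoursite.com/private/"))  # expect False if disallowed
```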
5. Friendly 404 Pages
Websites change. Some pages are added. Nearly all are edited at some point. A few may go away permanently. For those pages that are removed, Google may not always get the memo right away. An old or dead page may still be included in Google’s index. To avoid having a potential or returning customer land on an ugly error page, build a custom 404 page. These pages offer a visitor-friendly transition to other relevant pages on your site. Here are a few good samples.
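One detail worth verifying: a custom 404 page should still return a 404 status code rather than a 200, or the error page itself can end up indexed. A quick check, assuming the requests library and a made-up missing URL:

```python
# Confirm that a missing page returns a real 404, not a "soft 404" 200.
import requests  # third-party: pip install requests

resp = requests.get("https://www.yoursite.com/no-such-page", timeout=10)
print(resp.status_code)  # should print 404, not 200
```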
6. External Scripting
Ever sit down to watch a movie that has way too many previews? If so, you know how Google feels when it finds a site that does not use external scripting. Sites that cram all of their JavaScript, CSS and other code directly into each page, rather than into external files, make Google work too hard to find what it should be indexing easily. By using external scripting, large chunks of JS and CSS can be condensed into a single line of code that references an outside file. Give Google the ability to skip to the main feature – your keyword-rich HTML content.
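To get a feel for how much inline code a page is carrying, here is a rough standard-library Python sketch; the file name is a placeholder for a saved copy of the page.

```python
# Tally the bytes sitting inside inline <script> and <style> blocks;
# large totals are candidates for external files.
from html.parser import HTMLParser

class InlineCodeCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_block = False
        self.inline_bytes = 0

    def handle_starttag(self, tag, attrs):
        # A <script> with a src attribute is external; everything else is inline.
        if tag in ("script", "style") and not dict(attrs).get("src"):
            self.in_block = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_block = False

    def handle_data(self, data):
        if self.in_block:
            self.inline_bytes += len(data)

counter = InlineCodeCounter()
counter.feed(open("index.html", encoding="utf-8").read())  # placeholder file name
print(counter.inline_bytes, "bytes of inline script/style")
```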
7. Frames
Frames allow site owners to show text and graphics within a scrollable, embedded window on a page. Unfortunately, this outdated content presentation technique is still used by sites that are simply old school. Frames present an SEO problem in that they keep Google from viewing and indexing the content within the frame as part of your page. Eliminate frames to accentuate the important content on your web pages.
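A quick-and-dirty way to confirm whether a page still relies on frames is to scan a saved copy for frame markup; this little Python sketch (placeholder file name) does just that.

```python
# Scan a saved page for frameset/frame/iframe tags.
import re

html = open("index.html", encoding="utf-8").read()  # placeholder local copy
frames = re.findall(r"<\s*(frameset|frame|iframe)\b", html, flags=re.IGNORECASE)
print(f"Found {len(frames)} frame-related tag(s)")
```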
8. Dynamic URLs
Here’s an example of a search engine friendly URL string: www.yoursite.com/directory/page.htm
Here’s an example of a dynamic URL string:
www.yoursite.com/pages/index.html?group=8d4x&action=293&id=ke029
The search engines have evolved to handle dynamic URLs much better than they did in previous years. However, it is still recommended to clean up URL strings as much as possible. The more convoluted the URL, the harder Google has to work to find and index the page. If possible, use keywords in your URL strings to entice search engines further.
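As a simple illustration, here is a small Python sketch that flags convoluted URLs by counting query-string parameters; the threshold is arbitrary and purely illustrative.

```python
# Flag "dynamic" URLs by counting query-string parameters.
from urllib.parse import urlparse, parse_qs

def looks_dynamic(url, max_params=1):
    return len(parse_qs(urlparse(url).query)) > max_params

print(looks_dynamic("http://www.yoursite.com/directory/page.htm"))                               # False
print(looks_dynamic("http://www.yoursite.com/pages/index.html?group=8d4x&action=293&id=ke029"))  # True
```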

9. Excessive Links
Earlier, I recommended including a sitemap. Sitemaps are very important to a website’s ability to get indexed properly. Conversely, sitemaps and other web pages that include an excessive number of links can be a red flag for search engines. Pages with more than 100 links may appear to Google and others as link farms, which are pages that divvy out links just for the sake of divvying out links. That’s bad news if you are a website owner trying to garner rankings. Additionally, search engines may not index all the links on your page if you have too many. The more links you have on a page, the less authority or power each of those links carries with Google.
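If you want a quick sense of where your pages stand, here is a rough standard-library Python sketch (placeholder file name) that counts the links on a saved copy of a page.

```python
# Count the anchor links on a saved page; counts well past 100 deserve a second look.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.count += 1

counter = LinkCounter()
counter.feed(open("sitemap.html", encoding="utf-8").read())  # placeholder file name
print(counter.count, "links on the page")
```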
10. Preferred Coding
If there is a common theme here, it is this: make the search engine’s job a piece of cake. Certain coding practices that use complex table structure or are not compliant with W3C standards can be problematic. Clean, CSS-based code is preferred when developing sites for SEO success. This method uses the least amount of code possible to display text and images on a page that you would like to be indexed. In other words, eradicate all the extra coding as noted in #6 and give Google the content it needs to rank your site appropriately.
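One crude way to spot table-based layout is to count the layout tables on a page and see how deeply they nest; this Python sketch (placeholder file name, illustrative output only) gives a rough read.

```python
# Count <table> tags and their nesting depth in a saved page; deep nesting
# often signals presentation done in tables rather than CSS.
from html.parser import HTMLParser

class TableDepthChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.max_depth = 0
        self.total = 0

    def handle_starttag(self, tag, attrs):
        if tag == "table":
            self.total += 1
            self.depth += 1
            self.max_depth = max(self.max_depth, self.depth)

    def handle_endtag(self, tag):
        if tag == "table" and self.depth:
            self.depth -= 1

checker = TableDepthChecker()
checker.feed(open("index.html", encoding="utf-8").read())  # placeholder file name
print(checker.total, "tables, nested up to", checker.max_depth, "deep")
```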
11. Duplicate Content
Duplicate content primarily stems from an old SEO trick gone bad. Developers would create dozens, and in some cases hundreds, of pages on a site just for the search engines. These pages would be nearly identical except for their keyword target. For example, Page A would use “tow truck company” in a repetitive manner, and that phrase would be replaced by “tow truck services” on Page B. All other page elements and text would remain the same. The smart kids at Google picked up on this hack and subsequently outlawed duplicate content. Site owners should be careful to eliminate and/or modify pages that present seemingly similar information. 301 redirects should be implemented in cases where multiple URLs point to the same page content (e.g. yoursite.com, yoursite.com/home.htm and yoursite.com/main.htm all show the same content but render as separate pages).
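To spot this situation on your own site, here is a minimal Python sketch (assuming the requests library and the example URLs above) that fetches each variant and compares content hashes; identical digests mean identical pages that should be consolidated with a 301.

```python
# Fetch several URL variants and compare content hashes to spot duplicates.
import hashlib
import requests  # third-party: pip install requests

urls = [
    "https://yoursite.com/",
    "https://yoursite.com/home.htm",
    "https://yoursite.com/main.htm",
]
for url in urls:
    digest = hashlib.md5(requests.get(url, timeout=10).content).hexdigest()
    print(digest[:12], url)  # matching digests indicate duplicate content
```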
If you are a site owner or marketing manager, does your site pass with flying colors? If you are a technical SEO professional, are there other important factors for which you check?
Chris, I really dig this post, just like all of your previous posts on TTFD. On the subject of sitemaps, do you recommend any XML sitemap generation tool in particular? I have used GSiteCrawler in the past, and I believe I have had good results with it. I’d really appreciate your perspective, when you can spare a moment.
Thanks, Grant. Looking forward to your show in early April. I typically use this one. Set the priority to 0.5 for everything except your home page (0.8) and maybe two or three other key pages if it makes sense. No frills, just a basic build, upload and verify. For enterprise ecommerce sites with lots of content and changes, I’ve heard good things about KeyLimeTie’s tool. The only drawback is that it isn’t free, but it’s probably well worth it if you have the need. There is also a nice list of tools at XMLsitemap.com. WordPress also has a good plugin that auto-updates the sitemap with every post. Hope that helps.
Wow. Thank you, Chris! Once the show’s up, I’ll be experimenting with your suggestions, for certain.
This is a great post, Chris. You’re so right about the challenges of SEO for somebody like me — a marketer, not a coder, but working through SEO for my small-biz clients nonetheless. This post was extremely helpful. Thank you!
Excellent. Glad you found it valuable. Thanks for commenting.
A real pleasure to meet you yesterday. I’ve published a lot of the fixes you requested and we’ll see how the next 15 to 30 days go.
Good meeting you too, Andy. Thanks for your feedback. I hope to see things improve for your site in time.
Thank you for this beautiful post about search engine optimization… I really gained a lot of knowledge.
Chris,
I feel like you leave no stone unturned and add so many details that help experienced practitioners and newcomers alike. Thanks for sharing your knowledge!
Thanks, Camille. I appreciate the feedback.
Good post Chris.
I would also add:
Make sure all dynamic content is indexable and readable by bots, and that the site is 100% functional with scripts and plugins disabled.
It’s amazing how many sites rely on AJAX and other content that loads after the page itself, without exposing that data in a way that can be accessed by bots, users, screen readers, etc. Even the Apple.com website search wasn’t accessible with scripts turned off the last time I audited it a while back.
Cheers
Niall
Great addition. Thanks for adding that.
Really very attractive tips. Thanks for the post.
Good checklist to analyse some of the technical optimization aspects of a website. I would also add: 1) do a browser check to see if the website looks OK in all browsers and 2) add breadcrumbs to provide a way for search engines to quickly crawl all the pages in the website.
Good additions, James. Thanks.