Search Engine Optimisation Interview

for Yahoo

1: Outline the roles of people you work with during the course of an average week and detail the type of work undertaken, for example:

  1. Internal sales team – providing SEO tutorials & guidance
  2. Client marketing executives – marketing strategy meetings
  3. Client engineering – server config recommendations


  1. Web developers – briefings (web design, web architecture, URL structures, redirects...)
  2. Clients/client’s marketing directors – strategy meetings, progress reports, research findings and proposals
  3. Account managers – making sure the clients are happy, chasing developers and 3rd parties
  4. Sales – providing guidance, SWOT analysis

2: Outline your experience working with large databases / content management systems.

3: Outline your experience working with any web analytics / stat packages and reporting tools.

Three years of analysis using Urchin, Google Analytics, IndexTools, Hitwise, PPC data, the Yahoo keyword tool and Google Trends.

Researching gaps in the search market and filling them: identifying content creation opportunities, analysing the link-building and content creation strategies of competitors, and understanding the searching habits and trends in many different e-commerce niches.

4: Outline your experience working with content syndication or affiliation schemes.

I’ve successfully used Google Product Search several times on a small scale (up to 2,000 products) and created several scripts to generate feeds from the product databases.

I don’t syndicate my clients’ content to affiliates. My current job is to get content into search results using SEO strategies. It is often one of my goals to decrease the costs clients pay to affiliates.
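The feed-generating scripts mentioned above could look roughly like this minimal sketch – a hypothetical product table rendered as a tab-separated feed in the Google Base style (the field names and example rows are illustrative, not an actual client schema):

```python
import csv
import io

def generate_feed(products):
    """Render product rows as a tab-separated feed (illustrative field names)."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["id", "title", "description", "link", "price"],
        delimiter="\t",
    )
    writer.writeheader()
    for product in products:
        writer.writerow(product)
    return buf.getvalue()

# In-memory rows standing in for a product database query
rows = [
    {"id": "1", "title": "Widget", "description": "A widget",
     "link": "http://example.com/widget", "price": "9.99 GBP"},
]
print(generate_feed(rows))
```

In practice the rows would come from the client’s product database and the output would be uploaded or fetched by the shopping engine on a schedule.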

5: Outline three problems that typically arise when attempting to SEO dynamic, multiple language, CMS driven sites.

  1. Keyword research in foreign languages.
  2. Making sure the templates (views) use the best possible structure.
  3. Yahoo’s crawler cannot recognise unique content within infinite URL spaces, so unique content goes unindexed – there are associated web architecture, content categorisation & URL routing issues.

Non-navigational links – links to related products, tag-based navigation etc. – don’t work well with Yahoo’s current algorithm. Only tree navigation structures are applicable.

  4. Issues with changes in URL structures when rewriting is implemented or amended: keeping track of internal 301 redirects, complex .htaccess files.
  5. Duplicate content issues when the same chunks of information are pulled from the database repeatedly on different URLs, with associated keyword cannibalisation issues.
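A sketch of the kind of .htaccess rules involved (the paths are hypothetical): each restructure of the URL scheme adds another layer of permanent redirects, and these accumulate and must be kept in sync:

```apache
RewriteEngine On
# Old flat product URLs permanently redirected to the newer categorised scheme
RewriteRule ^product/([0-9]+)$ /shop/item/$1 [R=301,L]
# A later restructure adds another layer of redirects on top of the first
RewriteRule ^shop/item/([0-9]+)$ /catalogue/$1 [R=301,L]
```

Note that chained rules like these cost an extra redirect hop; ideally older rules are updated to point straight at the final URLs.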

6: Outline three typical SEO problems one can face when attempting to track user-clicks of individual lead generating links.

If a JavaScript link with tracking hashes is used (such as Kelkoo’s):

  1. The link will not pass link value.
  2. Search engines do not understand JavaScript, so the link is not followed. This is good for the rankings of Kelkoo’s pages – the value of those pages is not drained away.
  3. It is bad for popularity among SEO-wise web marketers.

7: Outline three typical SEO problems one can encounter when syndicating content to 3rd party affiliates

  1. Duplicate content problems – search engines use natural language processing methods such as w-shingling to determine the similarity of two documents. The problem occurs when the affiliate is considered the original source of the information. E.g. if two documents are too similar, Google will in most scenarios pick the document with the higher PageRank, and the original document will be outranked or buried in the search results.
  2. Affiliates can outrank the original source of information (that is the point of affiliates) and harvest the organic search traffic; there are usually associated expenses for the leads/clicks generated by the affiliates.
  3. An affiliate could misuse the value of the content, and the advertiser’s own content can become his search-market competition. Sometimes the syndicated content might even end up rehashed using Markov chains as a link farm for different projects of the affiliate.
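The w-shingling comparison mentioned in point 1 can be sketched as follows: split each document into overlapping runs of w tokens and compare the resulting shingle sets with the Jaccard coefficient. This is a toy illustration of the technique, not a search engine’s actual pipeline:

```python
def shingles(text, w=4):
    """Return the set of contiguous w-token shingles in the text."""
    tokens = text.lower().split()
    return {tuple(tokens[i:i + w]) for i in range(len(tokens) - w + 1)}

def resemblance(a, b, w=4):
    """Jaccard similarity of the two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)
```

Two near-identical documents score close to 1.0, so a syndicated copy of an article is easily flagged as a duplicate of the original.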

8: Provide an example URL of the last SEO project you have worked upon, together with one paragraph detailing the problems addressed.

Last project:

One line of code produced a five-fold increase in organic search traffic within a month. How? Search-friendly permanent (301) redirects were used instead of temporary (302) redirects for all the product lists. We are also separating the search functionality controller from the category/tag navigation controllers to remove infinite URL spaces, making the site more Yahoo-friendly.
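The one-line fix described above plausibly amounts to something like this (hypothetical paths; in Apache’s mod_alias, `Redirect` defaults to a temporary 302, so the permanent status must be stated explicitly):

```apache
# Before: temporary redirect – engines keep the old URL and pass little value on
# Redirect 302 /products/list /products
# After: permanent redirect – engines transfer indexing and link value to the new URL
Redirect 301 /products/list /products
```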

9: What would you say is/are the main technical problem(s) affecting the potential SEO performance of the following sites in local country indices: - -

Question left unanswered.

10: What is the biggest challenge/issue facing the long term SEO performance of product comparison sites?

SEO agencies and SEO contractors – strategic web marketers – understand that they will have a more sustainable source of traffic if they don’t feed their content to 3rd parties or affiliates such as product comparison sites.

Corporate web marketers prefer to optimise their own DB-driven systems to reap the search traffic themselves rather than constantly “flushing” the budget towards affiliates. It is probably getting harder to persuade web marketers to feed their content to 3rd parties as they understand the duplicate content issues. There is also strong competition from players such as Google Product Search.

It is getting easier to create DB-driven sites with tools such as Rails/Django, and competition in the long-tail part of the search market is slowly saturating as well.

11: Under what circumstances would you use the following:

  1. noscript – used when a user agent (e.g. a crawler) doesn’t understand JavaScript. Can be used to provide crawlable content equivalent to the JavaScript content.
  2. index, follow – you want that page to be indexed and the links on it to be followed.
  3. nofollow – can be part of the rel=”nofollow” attribute telling crawlers not to follow links (used to stop the flow of PageRank through those links – implemented by Wikipedia and by a lot of forums and blogs in their user-generated content sections).
  4. [R,L] – flags used in Apache mod_rewrite RewriteRule directives: R means redirect (e.g. R=301) and L means last.
  5. * / – is this referring to a robots.txt command to disallow or allow a user agent access to the whole of a site?


  1. I would not use noscript any more these days; DOM/AJAX is the preferred method. Accessible content can live in any element, not just noscript, and the DOM is used to replace it with its interactive alternative when scripting is available in the user agent. noscript can still be useful for hiding tracking pixels without complex coding.
  2. There is no need to use index, follow – indexing pages and following their links is the default behaviour of any search engine, so the directive adds nothing.
  3. nofollow can be used to:
    • Sculpt the internal link flow & the value passed by internal links, e.g. ensure that search engines will not pass value through site-wide links that are unimportant from the SEO perspective, such as personalisation features of the website, T&Cs, other language versions, the mobile version etc.
    • Stop the drainage of page value to external resources.
    • Discourage link spamming in user-generated content.
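As a small illustration of the nofollow use cases above (the URL is hypothetical):

```html
<!-- Crawlers are asked not to follow this link or pass value through it -->
<a href="http://example.com/" rel="nofollow">user-submitted link</a>
```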
  4. [R,L] – e.g.: RewriteRule ^cv /f/cv.html [L,R=301]
    I use it so that I can easily spell URLs out over the phone. In everyday SEO practice I use it to redirect old versions of URIs to their new counterparts.
  5. * / – these do not make sense without “Allow:” or “Disallow:” directives.
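If the question does refer to robots.txt, the * and / tokens would appear in directives like these – a sketch assuming that reading, blocking every crawler from the entire site:

```
User-agent: *
Disallow: /
```

An empty Disallow: under the same User-agent: * would instead permit access to everything.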