Some Internet marketing managers just don't want hot leads to visit their website. I conclude this after hearing that website owners actually call search engine customer service departments complaining that users are daring to enter sites directly on pages they're most interested in. These callers would prefer search engines that link users only to the homepages and never to pages inside the site.
Ticketmaster even filed a lawsuit to get other websites to stop sending users directly to interior pages where users could buy tickets to specific shows. In contrast to these cases, most websites treasure such direct and deep links, and pay dearly for affiliate programs that encourage other sites to send them targeted traffic.
These sites understand what our study of e-commerce usability showed: difficulties in getting from the homepage to the correct product page accounted for 27% of the failures (averaged across 20 sites). On average, better usability can substantially increase an e-commerce site's sales, so preventing deep linking would eliminate about a quarter of the potential sales from visitors arriving from search engines. These visitors are the hottest leads because of their current, specific interest in your products; you really don't want to lose their business.
Deep linking enhances usability because it is more likely to satisfy users' needs. Generic links, such as links to a company's homepage, are less useful than deep links that take users to an individual article or product.
Now that we finally have effective Web-wide search engines, users will likely use them more often to locate deep links to a specific solution, rather than starting out at a company's homepage.
Supporting Deep-Link Users
A website is like a house with a million entrances: the front door is simply one among many ways to get in. A good website will accommodate visitors who choose alternate routes.
Here are three guidelines for enhancing usability for users who enter your site at interior pages:
Tell users their arrival point, and how they can proceed to other parts of the site, by including these three design elements on every single page:
Company name or logo in upper left corner
Direct, one-click link to the homepage
Search (preferably in the upper right corner)
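As a rough sketch, a page header with these three elements might look like the following (the class names, file names, and URLs are invented for illustration, not taken from any particular site):

```html
<!-- Hypothetical page header: logo in the upper left doubles as a
     one-click link to the homepage; search sits in the upper right -->
<div class="page-header">
  <a href="/"><img src="/logo.gif" alt="Company Name - homepage"></a>
  <form action="/search" method="get" class="search-box">
    <input type="text" name="q">
    <input type="submit" value="Search">
  </form>
</div>
```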
Orient the user relative to the rest of the website. If the site has a hierarchical information architecture, a breadcrumb trail is usually the best way to do this. Also, include links to other resources that are directly relevant to the current location. Don't bury the user in links to all site areas or to pages that are unrelated to their current location.
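A breadcrumb trail for a product page could be sketched like this (the category names and URLs are hypothetical): each ancestor level is a link back up the hierarchy, while the current page is plain text.

```html
<!-- Hypothetical breadcrumb trail for a deep product page -->
<p class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/monitors/">Monitors</a> &gt;
  <a href="/monitors/flat-panel/">Flat-Panel</a> &gt;
  Model XYZ-170
</p>
```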
Don't assume that users have followed a drill-down path to arrive at the current page. They may not have seen information that was contained on higher-level pages.
An example of the third point: I was recently researching a flat-panel monitor on a computer vendor's site. I got there by searching Google for the monitor model after reading a positive review in a magazine. Once there, I could not find a link to the specs anywhere. Later in the session, I clicked the breadcrumb for the category page with all the monitors. There, I found the spec sheet: a PDF file that contained the specs for all the monitors, scattered across a brochure. That was bad enough, but worse, this essential selling tool was available only on the category page, not on the product pages.
When to Avoid Deep Links
In a few cases, deep links are counter-productive because certain pages cannot or should not be used before users have passed through higher-level pages.
An example from my own site: I have an exercise that asks readers of my homepage guidelines to evaluate the improvements in the recent redesign of the BBC's homepage. The exercise consists of two pages:
an initial page that explains the problem
a subsequent page that provides the solution
If these two pages are viewed in anything but the proper sequence, the entire exercise is spoiled. You can never look at a design with fresh eyes once you know where the usability problems are located. Thus, anybody who followed a deep link to the page with the solution would not benefit from the exercise.
It's easy to prevent search engines from deep linking to a specific page. Simply include the following meta-tag code in the HEAD part of the page:
<META NAME="robots" CONTENT="noindex">
Well-behaved search engines will exclude any such page from their databases.
Deep linking from non-search sites and from misguided search engines can be prevented by some fancy server-side programming that checks whether a user has been to the appropriate higher-level page before entering the sensitive page. I don't recommend this, however, because of the high likelihood of getting it wrong and preventing access by legitimate users. If a third-party website is so stupid that it links to a useless destination, then it's probably such a bad site that it has very little traffic anyway.
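For illustration only, here is roughly what such a referrer check could look like as a minimal Python WSGI sketch (the URLs are hypothetical). Note the fragility the paragraph above warns about: browsers and proxies often omit or strip the Referer header, so legitimate users get bounced too.

```python
PROBLEM_URL = "http://www.example.com/exercise/problem.html"

def gate_solution_page(environ, start_response):
    """Serve the solution page only to visitors who arrived from the
    problem page; redirect everyone else back to it.

    Fragile by design: if a browser or proxy omits the Referer header,
    a legitimate user is locked out along with the deep linkers.
    """
    referer = environ.get("HTTP_REFERER", "")
    if referer == PROBLEM_URL:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><!-- the solution page --></html>"]
    # Anyone without the expected Referer is sent to the problem page.
    start_response("302 Found", [("Location", PROBLEM_URL)])
    return [b""]
```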
Deep linking is your friend: it gets users to their preferred destination as quickly as possible. Thus, you should use the "noindex" meta-tag only for pages that users should never visit first.