Google's crawlers were designed from the start to understand plain HTML, so a mega menu built with complex code (AJAX or similar) can be ignored even when you test it with the Fetch tool. Google favors simple HTML because it wants the content on your page to be straightforward for users. Complex code could be the culprit here, but on its own it shouldn't undermine your overall SEO or your ability to rank well.

Google should have no problem crawling a mega menu, assuming it is implemented as nested unordered lists (<ul> elements) containing anchor (<a>) elements in the various list items (<li>). If you're doing something unusual, like building the menu out with complex AJAX calls, that might be an issue.
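To illustrate, here is a minimal sketch of that kind of markup (the category names and URLs are placeholders, not taken from any real site):

```html
<nav>
  <ul>
    <li>
      <a href="/clothing/">Clothing</a>
      <!-- Sub-menu as a nested unordered list; crawlers can follow
           these anchors even if CSS hides the list until hover. -->
      <ul>
        <li><a href="/clothing/shirts/">Shirts</a></li>
        <li><a href="/clothing/trousers/">Trousers</a></li>
      </ul>
    </li>
    <li>
      <a href="/footwear/">Footwear</a>
      <ul>
        <li><a href="/footwear/boots/">Boots</a></li>
        <li><a href="/footwear/sandals/">Sandals</a></li>
      </ul>
    </li>
  </ul>
</nav>
```

Because every link is an ordinary anchor in the served HTML, nothing about the dropdown styling prevents discovery.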

As far as we know, the Fetch & Render tool in Search Console emulates what Google now does during indexing to determine what content is visible to users on initial page load. But Google also knows which parts of the page are template (header, footer, left nav/sidebar and right sidebar if applicable, breadcrumbs) and which part is the main body/content section. It fully expects that the top navigation may use dropdowns and that those dropdowns will be hidden on initial page load. Just because they do not show in Fetch & Render does not mean they are not being crawled for discovery and counted.

But if the pages referenced inside the mega menu have been indexed and can be found in the SERPs for the keyword phrases they target, then the fact that Fetch & Render doesn't display them (within a menu) is not an immediate cause for concern.

Apart from this, make your menus logical and hierarchical if there are a lot of categories – users should be able to grasp the structure of the site from the way the menu is organized. They'll feel more comfortable and stay on the site longer.

Google has been searching through JavaScript for over a decade to find URLs for discovery purposes. In the last half decade or so it has also gotten extremely good at actually executing JavaScript and techniques built on it, such as AJAX. However, not all search engines are as adept at doing so. So if you want your navigation to be crawlable by all engines, it is recommended that you implement it in HTML, as sketched below. All search engines were designed to process HTML documents.
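If you want dropdown behavior without sacrificing crawlability, one common pattern is progressive enhancement: the links live in plain HTML, and JavaScript only shows and hides them. A minimal sketch, assuming hypothetical class names and a hover interaction:

```html
<nav>
  <ul class="menu">
    <li class="has-dropdown">
      <a href="/products/">Products</a>
      <!-- The links exist in the HTML whether or not
           JavaScript runs, so any crawler can discover them. -->
      <ul class="dropdown">
        <li><a href="/products/widgets/">Widgets</a></li>
        <li><a href="/products/gadgets/">Gadgets</a></li>
      </ul>
    </li>
  </ul>
</nav>
<script>
  // The script only toggles visibility; it never injects the links.
  document.querySelectorAll('.has-dropdown').forEach(function (item) {
    item.addEventListener('mouseenter', function () {
      item.querySelector('.dropdown').style.display = 'block';
    });
    item.addEventListener('mouseleave', function () {
      item.querySelector('.dropdown').style.display = 'none';
    });
  });
</script>
```

Because the anchors are present in the served HTML, even an engine that never executes the script can still discover the URLs.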

You can also check your web server logs to see where Googlebot and other crawlers have been; tools like AWStats or Splunk make this easier.
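If you just want a quick look without a full analytics package, a short script over the raw access log works too. A minimal sketch in Node.js, assuming a combined log format and a hypothetical log path; adjust both for your server:

```javascript
// scan-log.js - count Googlebot requests per URL in an access log.
// The log path is just an example; the regex assumes a request line
// like "GET /some/path HTTP/1.1" inside each log entry.
const fs = require('fs');
const readline = require('readline');

const counts = {};
const rl = readline.createInterface({
  input: fs.createReadStream('/var/log/nginx/access.log'),
});

rl.on('line', (line) => {
  if (!line.includes('Googlebot')) return;
  const match = line.match(/"[A-Z]+ (\S+) HTTP/);
  if (match) counts[match[1]] = (counts[match[1]] || 0) + 1;
});

rl.on('close', () => {
  // Print the most-crawled URLs first.
  Object.entries(counts)
    .sort((a, b) => b[1] - a[1])
    .forEach(([url, n]) => console.log(`${n}\t${url}`));
});
```

Run it with `node scan-log.js`. Note that anyone can spoof the Googlebot user agent, so treat the counts as indicative rather than exact.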
