I’ve gone through enough website launches to know how stressful they can be. You spend a ton of time thinking about the design, building the architecture and individual pages, and writing content that will attract the right visitors. The last thing you want to see after all that work is your traffic dropping off a cliff.
Even with the best redesigns, it’s not unusual to see a slight dip in organic traffic when you launch. This is because you have a new site structure that Google and the other search engines need to reindex. That is, one of their search crawlers examines your website's content and logs it in their database. You can minimize the disruption in your search results, however, so let’s take a look at four key considerations when optimizing your website for Google’s web crawlers.
Depending on the content you're replacing, indexing can take anywhere from a few hours to a few weeks. If your site has been static for a long time or doesn’t get much traffic, it takes longer for Google to reindex it. When you launch a new website, there are a few things you can do to help get your site reindexed faster.
A sitemap is a structured file that Google can read to understand your website's architecture. XML-Sitemaps.com is a good resource that will generate a sitemap for you: it scans the first 500 pages of your site for free, and you can purchase the standalone version if you have a larger site. Once you have your sitemap, you need to submit it to Google via Google Search Console. Creating a Google Search Console account and adding your website to it should be part of your new website launch checklist, as it offers many useful tools such as a backlink checker and search analytics. You can also consider using HubSpot to create a sitemap XML file for submission to Google.
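For reference, a minimal sitemap in the sitemaps.org XML format looks like the sketch below; the URLs and dates are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, <lastmod> is optional -->
  <url>
    <loc>https://www.mycompany.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.mycompany.com/contact-us</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>
```

The optional `<lastmod>` date helps crawlers prioritize pages that have changed recently.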
If you’re updating an existing site, it’s vital to update your sitemap as well. As part of that process, make sure your website and sitemap use the same protocol (for example, both HTTPS) so the web crawlers have less trouble indexing.
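You can also point crawlers at your sitemap from your robots.txt file, which search engines check when they crawl your site; the domain below is a placeholder for your own:

```
# robots.txt at the site root
# The Sitemap directive tells crawlers where to find your sitemap file
Sitemap: https://www.mycompany.com/sitemap.xml
```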
Another thing you can do in Google Search Console is ask Google to recrawl your website. With this tool, you submit individual URLs for Google to crawl, along with direct links to those pages. Take this step as soon as possible, because Google provides no timetable for accommodating the request.
Google will increase the frequency with which it recrawls a page if it recognizes that the content on that page is updated regularly and is more popular with visitors. For instance, a news website with hourly updates will be recrawled often while an out-of-date business website with no new content and low traffic will be recrawled very seldom.
You can use this to your advantage. If you already have some other highly trafficked or updated pages, such as a blog on a subdomain, you can point a link on that page to your new website to help Google’s web crawlers find it faster. It’s important to remember that your linking needs to be relevant. If visitors navigate to the linked pages and then bounce, your rankings will suffer because of unrelated content.
These steps can help limit a drop-off in web traffic. Typically, the drop in traffic is no more than 5% and shouldn’t last longer than a few weeks. But if your traffic drops by more than that, or fails to recover within a month, you need to start analyzing your site for issues that could be driving visitors away. Identifying these problems can be challenging, but tools like Moz Site Crawl and Google Search Console can streamline the search for errors and other problems.
So what if decreasing the index time didn’t help? Try these next steps and re-evaluate.
When you change the structure of your site, redirects send visitors from a page's old location to its new one. For instance, if your contact page used to live at www.mycompany.com/about-us/contact but is now at www.mycompany.com/contact-us, a redirect will send visitors to the new URL when they try to visit the old one. A redirect also transfers each old page's authority to its new counterpart. If you don’t set up redirects, returning visitors will start hitting errors, and those errors tell search engines that you have bad content and ding your site's rankings.
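On an Apache server, for example, the contact-page move above can be handled with a one-line permanent redirect in your .htaccess file; this is a sketch assuming the mod_alias module is enabled, and other servers such as nginx have equivalent directives:

```apache
# .htaccess — permanently (301) redirect the old contact URL to the new one
Redirect 301 /about-us/contact /contact-us
```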
A good practice is to perform a site audit of your old site before launching a new one to map out pages and put the redirects in place. If that’s a step that was skipped and you see a drop in your traffic, a site audit is highly recommended.
The status codes that your website returns when visitors request content can affect your website’s rankings as well. Four codes in particular tell the story of what’s happening on your site.
Status code 404 — the page being requested couldn't be found on the server
Status code 301 — the resource or page being requested has been permanently moved
Status code 302 — the resource or page has been temporarily moved to a new location (a lingering 302 may not pass the old page's authority, so your search rankings can drop)
Status code 200 — the resource was found; but if pages you removed still return 200 instead of a redirect or a 404, Google may see your site as having a significant amount of duplicate or thin content
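The checks above can be scripted as part of a post-launch audit. Here is a minimal sketch in Python: the `crawl_results` dict is illustrative sample data standing in for a real crawl of your old URLs, and the flagging rules simply mirror the status-code guidance above.

```python
# Map each old URL to the status code a crawl of the new site returned for it.
# These entries are sample data, not a real crawl.
crawl_results = {
    "/about-us/contact": 301,   # permanently redirected — ideal
    "/old-blog/post-1": 302,    # temporary redirect — should usually be a 301
    "/old-services": 404,       # no redirect in place — visitors hit a dead end
}

def audit(results):
    """Flag URLs whose status code suggests an SEO problem after a relaunch."""
    issues = []
    for url, status in results.items():
        if status == 404:
            issues.append((url, "missing redirect (404)"))
        elif status == 302:
            issues.append((url, "temporary redirect — consider a 301"))
    return issues

for url, problem in audit(crawl_results):
    print(f"{url}: {problem}")
```

Running this against a real crawl export gives you a punch list of old URLs that still need permanent redirects.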
It's important to keep tabs on the status of your website. Google Search Console and other monitoring platforms are great for making sure your links are not broken and diagnosing problems as they come up. With the proper site audits, redirects and sitemaps in place, the impact of the launch on your website's SEO should be minimal. Add in the right support, and you should see improved traffic growth over your old website design.
Before you launch your site, develop a comprehensive SEO strategy to help drive content creation and technical SEO maintenance. That strategy, and the maintenance that follows from it, should be an ongoing part of your overall website support plan.
It’s common to panic when you see a drop in traffic after a new site launch or the update of an existing site. But a few key considerations can help limit or even prevent the drop off, help traffic rebound and get you back to achieving your goals quickly.