
Weidert Group's Growth-Driven Design Website Case Study

April 2018

Learn how Growth-Driven Design (GDD) works by reviewing Weidert Group’s experience implementing this methodology on our own website.

Our Challenge

[Screenshot: homepage before redesign]

Weidert Group’s website was attracting a growing amount of traffic, but it wasn’t functioning the way we wanted: the user experience (UX) was not as intuitive as it could be, and the site was cumbersome to view on mobile. And while the site attracted and converted leads, most were not from the target industries we wanted most. Finally, the site had been fairly static for 3 years, with only minor changes to pages outside of our daily blog. The design was due for a refresh, as the analytics and our own judgment indicated its effectiveness was waning.

Learn more about how to implement GDD for yourself — view our GDD webinar!

The Solution

Though our typical approach would have been to completely overhaul our website, we weren’t able to commit the time necessary to go through a traditional website development process (one start-to-finish sequencing of work). Instead, we opted to leverage Growth-Driven Design (GDD) principles and methods, with the expectation that we would be able to:

  • Create an initial website (a “launchpad” site) within a relatively short time frame
  • Distribute our budget (time/talent) evenly across a year (with a modest investment for the initial site, and smaller ongoing requirements for iterative, analytics-informed changes)
  • Evolve toward a site leveraging user data to make continual updates as we learned, resulting in an “always optimized site” state

[Screenshot: homepage after redesign]

The Process

Strategy Phase

  • We first re-identified our best prospects and updated the associated buyer personas. To ensure we would be speaking primarily to those prospects, we redefined our target audiences, prioritizing Industrials (and the businesses that serve them) as our focus for design and content
  • We developed a new branding strategy (colors, fonts, graphics/photo/visual approach, etc.) to more accurately communicate our focus
  • We created site goals and a wish list of modifications. Our goals for the new site were to:
    • Capture 13 marketing-qualified leads (MQLs) and 4 sales-qualified leads (SQLs) each month
    • Increase visitor sessions to 35,000 per month
    • Increase new contacts to 296 per month
    • Increase the number of new customers to 1 per month

The wish list was a fairly exhaustive itemization of all the strategic and tactical features stakeholders wanted as part of the new site. It included changes or additions such as a speaker’s bureau page, a cost calculator, expanded team bios, a process page outlining how we work with clients, and much more.

Launchpad Build Phase

  • The next step was to begin planning a launchpad site – a site that delivered all the essential functionality and content to visitors and that could then be improved on an ongoing basis.
  • We first prioritized the wish list based on what we believed would help most in reaching our goals, then outlined the input/work we’d need from team members and made assignments tied to a detailed timeline. Not all wish list items were executed; those that remained after launch were kept on the wish list for review during continuous improvement cycles
  • New content (copy and graphics, plus functionality) was created for web pages and the launchpad site was built. This phase took just 6 weeks, compared to an average of approximately 5 months for a traditional site build for the same number of pages
  • The site was launched on time, on September 21
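The prioritization step above can be illustrated with a simple impact-versus-effort scoring sketch. The items, scores, and weighting below are hypothetical examples for illustration only, not Weidert Group's actual list or method:

```python
# Hypothetical sketch of prioritizing a GDD wish list: rank each item
# by expected impact on site goals relative to the effort it requires.
# All items and numbers are illustrative, not real project data.

wish_list = [
    # (item, expected impact 1-10, effort in work days)
    ("Speaker's bureau page", 3, 4),
    ("Cost calculator", 8, 10),
    ("Expanded team bios", 4, 3),
    ("Client process page", 7, 5),
]

def priority(item):
    name, impact, effort = item
    return impact / effort  # simple impact-per-day ratio

ranked = sorted(wish_list, key=priority, reverse=True)
for name, impact, effort in ranked:
    print(f"{name}: impact {impact}, effort {effort}d, score {impact / effort:.2f}")
```

Items near the top of the ranking would go into the launchpad build; the rest would stay on the wish list for later continuous improvement cycles.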

 

Continuous Improvement Cycles

  • In that first iteration of the site, judgment rather than data guided our decisions. In all subsequent cycles, though, data has been used to determine what changes need to be made to improve user experience and goal metrics
  • The wish list items not executed for the launchpad site were presented at the first monthly GDD meeting, along with other suggestions people on the team felt would improve the site. These were prioritized and associated work assigned to team members
  • A similar process is followed in each cycle; our cycles are 4-5 weeks, usually around 20 work days.

GDD Results

As mentioned in the Strategy phase, we identified goals for sessions, contacts, MQLs, SQLs and customers. In the first 6 months we blew those goals out of the water. We couldn’t be more excited about the growth we’ve achieved, not to mention everything we’ve learned by implementing this process. Taking this approach, as opposed to a “set it and forget it” approach, has taught us a great deal about how visitors use our site: what they’re interested in, what drives them to stay on a page longer, and many other insights we never would have had if we hadn’t taken a GDD approach.

KPIs

Metrics from Changes Made in Continuous Improvement Cycles

Since launching our launchpad site in September 2017, we have completed 6 continuous improvement cycles as of April 2018. Through those cycles we have learned a lot about what users on our site are interested in. Below are some changes made during our continuous improvement cycles that contributed to our phenomenal growth.

Updating the Consultation Landing Page

Original

  • Too much text
  • Included site navigation that allowed/encouraged visitors to leave the landing page (example: a “Meet our team” button)

[Screenshot: free consultation landing page, before]

Modifications Made

  • Hypothesis: By taking away the navigation and adding video to the page, we would increase time on page and conversion rate
  • Turned the text into a video – a far more engaging way to communicate to the audience
  • Removed the landing page navigation

[Screenshot: free consultation landing page, after]

[Chart: consultation page results]

Removed the header banner on our blog’s mobile view

Original

  • A large (“tall”) header banner forced mobile users to scroll quite a bit before reaching the content they were looking for, leading to a high page exit rate

Modifications Made

  • Hypothesis: By eliminating the header banner on our blog, we would increase time on page and decrease the exit rate
  • We removed the banner from the mobile version of the site. The improvement in number of “eyes” that reach content lower on the page is seen in the Hotjar heatmap captures below. The warmer the color, the more views that object or section has received from visitors

 

[Hotjar heatmap captures: mobile blog view before and after banner removal]

Executed a mobile-centered design to improve the mobile user experience (launch to date compared to the previous period)

Original

  • Since over 85% of our traffic originated on desktop, mobile was not made a top priority

Modifications Made

  • Hypothesis: By implementing a mobile-first design, we would grow our mobile traffic and keep mobile visitors on our site longer
  • Since launch, we have seen enormous page view growth along with time-on-page growth
[Chart: mobile-centered design results]


Redesigned Call-to-Action Buttons

All our advanced content CTAs were updated to be more visually interesting and to feature a headline that was clearer about the value of the advanced content.

Hypothesis: By making the headline clearer, we believed more people would click on our CTAs.

[Chart: Annual Marketing Plan CTA performance]


[Chart: Step-by-Step CTA performance]

Additional Takeaways

  • Most changes we make to our site begin as experiments – “Could we get more clicks on this link if we removed it from the body copy?” “Would a new design for this chart get more eyes on it?” “Will a cost calculator get visitors closer to requesting a demo?” Once we make a modification to the site, the data gathered from users’ visits tells us whether or not our experiment worked.
  • Experiments themselves are based on informed assumptions. We create experiments using what we know about our prospects, the data we already have collected and our knowledge of what people expect from websites.
  • Not all of those experiments work, but they all teach us something! One example is a form strategy update we made to a landing page conversion form. After adding questions that required a visitor to select a drop-down option, we saw a decrease in the number of visitors filling out the form. This was a good indication that visitors don’t want to provide that level or amount of information.
  • Our wish list is based on judgements we make about what we think visitors will find useful or interesting. We added a page to our site that explains how we work with our clients because prospects have occasionally asked, “How does a typical inbound marketing engagement with you work?” User data will tell us if they’re spending time on that page (and how long, what areas they’re clicking on, where they go from this page and more).
  • Metrics take time. Your website’s backend doesn’t automatically spit out numbers in a neat, usable package. But once gathered and evaluated, they have the power to transform your website into a highly effective lead attraction and conversion machine.
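To show how a site experiment like the ones above might be judged from the data, here is a minimal sketch that compares conversion rates before and after a page change using a two-proportion z-test. The visitor and conversion counts are made up for illustration; Weidert Group's actual analysis was done in its analytics tools, not in code:

```python
import math

# Hypothetical sketch: did a page change improve conversion rate,
# or could the difference be noise? All counts below are invented.

def conversion_lift(conv_a, n_a, conv_b, n_b):
    """Return (rate_a, rate_b, z) for conversions out of sessions
    before (a) and after (b) a change, using a pooled two-proportion
    z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return p_a, p_b, (p_b - p_a) / se

# Before: 40 conversions from 2,000 sessions; after: 70 from 2,100.
rate_a, rate_b, z = conversion_lift(40, 2000, 70, 2100)
significant = abs(z) > 1.96  # roughly a 95% confidence threshold
print(f"before {rate_a:.1%}, after {rate_b:.1%}, z = {z:.2f}")
```

If `significant` is false, the experiment still taught something, as the takeaways above note: the change simply didn't move the metric enough to stand out from normal variation.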

Since our launchpad site went live in September 2017, we have continuously made updates and have seen enormous success from doing so. If you want to learn more about what a Growth-Driven Design engagement might look like for your company, reach out to us. We’d love to hear from you!

Curious About Inbound Marketing? Request a free consultation and let’s talk about how we can grow your business.


